I read Hillis' book on the Connection Machine, back when he first published it.
I was studying AI at university at the time and had just discovered the back-propagation method for training neural nets. I thought the CM might be good at that.
But any burly dot-product accelerator would be way more appropriate; that's why GPUs are a good fit for training and inference.
The Connection Machine might be ideal for a Gelernter tuple-space memory.
Later, I worked at a biotech company that actually had a real Connection Machine; they used it as a huge sliding-window pattern matcher, searching published literature and patent filings for gene sequences to find prior art. Holy moly... These days a CPU would be fast enough; memory sizes are much greater now.
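For anyone curious, the sliding-window idea is simple to sketch sequentially. This is just an illustrative toy, not the biotech company's actual code: slide a query sequence across a text and score how many positions match at each offset. On the CM, the whole point was that each offset could be scored by a different processor in parallel.

```python
def sliding_window_matches(text, query, min_score):
    """Return (offset, score) pairs where at least min_score
    characters of query match text at that offset.
    (Illustrative names; on a CM each offset would be scored
    in parallel rather than in this sequential loop.)"""
    hits = []
    for offset in range(len(text) - len(query) + 1):
        window = text[offset:offset + len(query)]
        # Count positions where the window agrees with the query.
        score = sum(1 for a, b in zip(window, query) if a == b)
        if score >= min_score:
            hits.append((offset, score))
    return hits

# Toy example: find near-exact occurrences of "ACGT".
print(sliding_window_matches("ACGTACGTAC", "ACGT", 4))  # → [(0, 4), (4, 4)]
```

Allowing `min_score` below the query length gives you approximate (mismatch-tolerant) matching, which is what makes this useful for gene sequences rather than plain substring search.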