• http://www.lfgss.com/comments/12754992/

    It is called connecting the dots. The point I was trying to make is that their control is not the typical decision tree many might expect, and that they are not just programmed with scenarios. AI is no longer logic and symbols but statistics, control and black-box tools-- I call them black box since, at this time, we don't quite understand what features the systems are really learning, only that they seem to be doing a very good job. Symbolic AI-- explicitly building things with facts and rules (anyone remember the big wave of expert systems back in the 1980s?)-- is also enjoying a bit of a renaissance at the edges of cognitive science, though in quite a different form with the addition of dynamics. The big revolution right now is the connectionist approach kicked off in the late 1980s and early 1990s, now paired with the data, the hardware (GPUs) and the tools (especially tools to use those GPUs) to make it work at scale. Remember that Thinking Machines' CM-1 topped out at 65,536 compute cores, their last machine maxed out at 1,024 SuperSPARC processors, and, as a friend commented, "Getting a program to run on a CM was sufficient for a PhD at Princeton."
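    The symbolic-vs-connectionist contrast above can be made concrete with a toy sketch (all names and the task here are illustrative, not from any real system): a rule written down by hand versus a tiny perceptron that absorbs the same rule from examples without ever representing it explicitly.

```python
# Toy contrast: a symbolic rule vs. a connectionist model learning the
# same concept ("is this point above the line y = x?") from examples.
# Purely illustrative -- not any production system's code.

def symbolic_classifier(x, y):
    # Symbolic AI: the rule is stated explicitly by a person.
    return 1 if y > x else 0

def train_perceptron(samples, epochs=50, lr=0.1):
    # Connectionist AI: no rule is stated; weights are nudged from data
    # until the behaviour emerges (statistics, not logic).
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x, y), label in samples:
            pred = 1 if w1 * x + w2 * y + b > 0 else 0
            err = label - pred
            w1 += lr * err * x
            w2 += lr * err * y
            b += lr * err
    return w1, w2, b

# Training data generated by the symbolic rule itself (x == y excluded
# so the classes are cleanly separable).
data = [((x, y), symbolic_classifier(x, y))
        for x in range(-3, 4) for y in range(-3, 4) if x != y]
w1, w2, b = train_perceptron(data)

# The learned weights reproduce the rule, but nothing in (w1, w2, b)
# "says" y > x -- which is why such models read as black boxes.
agree = sum((1 if w1 * x + w2 * y + b > 0 else 0) == lbl
            for (x, y), lbl in data)
print(agree, "of", len(data), "correct")
```

    The point of the sketch: inspecting the trained weights tells you far less than inspecting the hand-written rule, even when both compute the same function.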

    Here is an old paper by Minsky that is still a pretty relevant backgrounder:
    http://web.media.mit.edu/~minsky/papers/SymbolicVs.Connectionist.html

    When Google got into the game they started off using a massive number of computers and CPUs. That was back in 2011/2012. The big "bang" came the next year, when they went from CPUs to GPUs: using just 3 machines, each with consumer quad-core processors and cheap GPU cards, they were able to duplicate their earlier work.
    See http://jmlr.org/proceedings/papers/v28/coates13.pdf
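    Why GPUs made such a difference: training a connectionist net is dominated by dense matrix multiplies, and every output cell of a matrix product is an independent dot product -- exactly the kind of work thousands of GPU cores can chew through in parallel. A pure-Python sketch (sizes illustrative) of the core workload:

```python
# Dense matrix multiply: the kernel at the heart of neural-net training.
# Each C[i][j] depends on nothing but row i of A and column j of B, so a
# GPU can assign one thread per cell; a CPU must walk them serially.

def matmul(A, B):
    n, m, k = len(A), len(B), len(B[0])
    C = [[0.0] * k for _ in range(n)]
    for i in range(n):
        for j in range(k):
            s = 0.0
            for p in range(m):
                s += A[i][p] * B[p][j]
            C[i][j] = s
    return C

# One "layer" of a net is activations (batch x inputs) times weights
# (inputs x units): n*k independent dot products per layer, per step.
acts = [[1.0, 2.0], [3.0, 4.0]]
weights = [[1.0, 0.0], [0.0, 1.0]]   # identity weights for the demo
print(matmul(acts, weights))          # → [[1.0, 2.0], [3.0, 4.0]]
```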

    Instead of borrowing graphics cards we now have GPGPUs-- GPUs designed for use as general-purpose computational processors, with better bandwidth, more memory and more cores. The Tesla K80, for example, has 24 GB of RAM, 4992 cores and about 480 GB/s of aggregate memory bandwidth.
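    Those spec numbers translate into a quick back-of-envelope (a rough sketch using the K80 figures as quoted above, not a benchmark):

```python
# Back-of-envelope from the K80 spec numbers: 24 GB of RAM, 4992 cores,
# ~480 GB/s aggregate memory bandwidth (across its two on-board GPUs).
mem_gb = 24
bandwidth_gb_s = 480.0
cores = 4992

# Time just to stream the full 24 GB through memory once:
stream_s = mem_gb / bandwidth_gb_s
print(round(stream_s * 1000, 1), "ms")   # → 50.0 ms

# Elements if all 24 GB held 4-byte floats, and how many each core
# would handle if the work were split evenly:
elements = mem_gb * 1024**3 // 4
per_core = elements // cores
print(elements, "floats,", per_core, "per core")
```

    The takeaway: even a memory-bound pass over the whole card takes only tens of milliseconds, and each core's share of the data is small enough to keep thousands of them busy at once.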

    The current state of the art is hybrid systems that mix CPUs and GPUs. The first major hybrid was Oak Ridge's Titan: roughly 18K GPUs (Tesla K20Xs) paired with 18K 16-core AMD Opteron CPUs.

    Even the little Tegra X1 that one can find in a phone or tablet (right now Google's reference Pixel C) offers 256 CUDA cores alongside 8 CPU cores.


@EdwardZ started