Harmony Central Forums
Mr. Knobs: Eugene Goostman has a guinea pig, so I guess you're wrong about AI

  • #16
    Turing or no, it's remarkable how far AI has progressed.

    And while there's surely a number of people who aren't real bright - one look at the average comments on YouTube bears this out - writing off an entire generation or two of people as dumber smacks of some octogenarian in baggy suspenders yelling, "Damn kids! Get off my lawn!"
    Last edited by UstadKhanAli; 06-21-2014, 10:22 PM.
    Ken Lee on 500px / Ken's Photo Store / Ken Lee Photography Facebook Website / Blueberry Buddha Studios / Ajanta Palace Houseboat - Kashmir / Hotel Green View - Kashmir / Eleven Shadows website / Ken Lee Photography Blog / Akai 12-track tape transfers / MY NEW ALBUM! The Mercury Seven



    • #17
      people have an incredibly sophisticated sense of evaluation developed from millions of years of trying to stay alive.

      just think about how little has to be askew in a living person to drive us to reject them.

      we walk down the street, we evaluate a stranger and decide whether they're a friend or threat in a primal and complex way.

      we take it for granted, but I think we tap into that same primal sense when evaluating a virtual person.
      Last edited by Goober(s); 06-21-2014, 09:03 PM.



      • #18
        Originally posted by UstadKhanAli View Post
        Turing or no, it's remarkable how far AI has progressed.

        And while there's surely a number of people who aren't real bright - one look at the average comments on YouTube bears this out - writing off an entire generation or two of people as dumber smacks of some octogenarian in baggy suspenders yelling, "Damn kids! Get off my lawn!"
        Well, the "made people dumber" was sort of a joke.

        The rest is not, though. AI is one of the major disappointments in Computer Science, the other being parallel processing. When I was studying computer science in college there was all sorts of arm-wavy, pie-in-the-sky talk about these two areas and how they would change the future. Neither has come to fruition, and both are still sorely needed.

        For AI we have clever programs that appear to possess some intelligence but are entirely predictable (though somewhat more difficult to predict than in 1976), and for any meaningful parallel processing we still have to have a person write the program to work in a parallel fashion (e.g. gigantic matrix inversion by parts, circa 1970) vs computer software figuring out which parts of a process are parallelizable and doing that automatically.

        This latter is extremely important as we seem currently stuck with processors that top out around 3 GHz. And yes, I know that multicore processors are helpful when running multiple applications at once.

        Terry D.
        Last edited by MrKnobs; 06-25-2014, 02:55 PM. Reason: Some day I'l discover how to use spell check on here
        Telling Stories releases 2nd CD, see our WEBSITE! Please check out my GROUPIE STORY and Tales from the Road.



        • #19
          Sorry. I hear comments like this so often from people on this forum about younger people being dumber that I can't tell whether anyone is joking anymore.



          • #20
            Originally posted by UstadKhanAli View Post
            Sorry. I hear comments like this so often from people on this forum about younger people being dumber that I can't tell whether anyone is joking anymore.
            After teaching bright young grad students for years I'd never make that statement!

            I do think our culture is dumbing down a bit, though.

            Terry D.



            • #21
              Originally posted by MrKnobs View Post
              ... for any meaningful parallel processing we still have to have a person write the program to work in a parallel fashion (e.g. gigantic matrix inversion by parts, circa 1970) vs computer software figuring out which parts of a process are parallelizable and doing that automatically.

              This latter is extremely important as we seem currently stuck with processors that top out around 3 GHz. And yes, I know that multicore processors are helpful when running multiple applications at once.
              I bet you'd be surprised how many applications are multithreaded these days. Rather than take the "parallel from the get-go" approach to thinking about and building parallel algorithms, we've taken a more circuitous (but far more practical) path of starting with mere timesharing for most programs, while embedded programmers (like me) worked on multitasking using real-time operating systems. The synthesis of the two disciplines led to the "threaded" model for UNIX applications, where we could write a single program but split it into multiple threads that could run concurrently (usually sharing the same processor), coordinating with each other and sharing the same memory.
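              To make that threaded model concrete, here's a minimal Python sketch (the worker function and counts are made up for illustration): one program, two threads of execution sharing the same memory, coordinating through a lock.

```python
import threading

counter = 0              # shared memory, visible to both threads
lock = threading.Lock()  # coordination primitive

def worker(n):
    """Increment the shared counter n times."""
    global counter
    for _ in range(n):
        with lock:       # serialize access so updates aren't lost
            counter += 1

# Two threads of one program, running concurrently and sharing state
t1 = threading.Thread(target=worker, args=(10000,))
t2 = threading.Thread(target=worker, args=(10000,))
t1.start(); t2.start()
t1.join(); t2.join()
print(counter)  # 20000 -- without the lock, updates could be lost
```

              The lock is exactly the kind of explicit coordination the post describes: the programmer, not the compiler, decides what runs in parallel and where the threads must synchronize.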

              Later, multi-core and multi-processor systems added the ability to farm separate threads onto different cores or processors. Our DAWs all do that today, thank goodness, or it'd take a lot more processor speed to do what we do (for those of us who use virtual instruments or process-heavy effects, and lots of tracks).

              Now, it would be cool to completely rethink how we program computers and design new languages that take advantage of all the parallelism that's possible. For example, if I write "x = a*b + c*d" it's easy to see that a*b could be calculated in parallel with c*d. But as it turns out, not only does this raise a lot of technical challenges (at the language definition level, programming practice level, operating system level, and computer architecture level), it doesn't give us much benefit until we have thousands and thousands of processors.
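              A quick Python sketch of that expression-level parallelism (the values of a through d are made up): the two independent products are handed to separate workers, and the addition waits on both, which is exactly what a parallelizing language would have to discover automatically.

```python
from concurrent.futures import ThreadPoolExecutor
import operator

a, b, c, d = 2, 3, 4, 5

# a*b and c*d share no inputs, so they can be evaluated concurrently;
# the final '+' is a synchronization point that needs both results.
with ThreadPoolExecutor(max_workers=2) as pool:
    ab = pool.submit(operator.mul, a, b)  # a*b on one worker
    cd = pool.submit(operator.mul, c, d)  # c*d on another
    x = ab.result() + cd.result()         # blocks until both finish

print(x)  # 26
```

              Notice how much machinery two multiplications take when the programmer has to spell out the parallelism by hand -- which is the point: at this granularity the coordination overhead swamps the benefit until the hardware is massively parallel.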

              We'll get there. The main reason we're not exploiting parallelism as much as possible is that we'd get diminishing returns. We know what the problems are and academics have presented a lot of ideas on practical approaches.

              So far, only a few applications benefit from massive parallelism. Graphics processors are a great case in point: it's an application that needs it, and it's how they work! It works so well that guys who do massively parallel math use the GPU rather than the CPU. (For example, bitcoin mining and codebreaking.)

              Other massively parallel applications include weather and climate models, and stuff like SETI (the Search for Extraterrestrial Intelligence). Some of these take advantage of all those idle home computers out there: you can sign up and allow your PC to get used for science, so problems get worked out using many thousands of volunteer computers.

              And the latest deal in parallelism is "The Cloud" that we've all been hearing about and using. All this virtualization is a (gosh darn complex) way of setting up lots and lots of identical resources that can be used by whatever needs them at the time, which provides incredible flexibility and lower cost. What's cool about this approach is that it allows you to use one thing as many or many things as one -- it doesn't care! It's like loading trains using liquids rather than lots of oddly sized boxes. Let the stuff flow and fill the available space.
              learjeff.net


