there are so many cores

Just another WordPress.com site

Chickens and eggs

Today is the first day in two weeks that I’ve been able to do any real work. I’m finally adding image support for the gather operation. I was able to take a walk in the sun instead of hanging around in the hospital with a laptop and books (mostly sitting on a folding chair next to a red biohazard box and antibacterial hand sanitizer).
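
For anyone who hasn’t run into the term, a gather is just an indexed read: each output element pulls a value from an arbitrary location in a source array. As a rough sketch only (not my image-based implementation, just the plain 1D case in CUDA):

    // Rough sketch of a plain 1D gather in CUDA: out[i] = src[idx[i]].
    // (Illustration only; the image-based version isn't shown here.)
    __global__ void gather1d(float* out, const float* src, const int* idx, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            out[i] = src[idx[i]];
    }

    // Host-side launch, assuming d_out, d_src, d_idx are device pointers:
    // gather1d<<<(n + 255) / 256, 256>>>(d_out, d_src, d_idx, n);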

This should be obvious: new technology often shows up first in a nascent form. For example, before iTunes there was Napster; the innovation was a new commercial market for digital media.

Everything with GPGPU involves machine learning, even SETI@home and Folding@home. There’s a big combinatorial space to search for good stuff. That’s what data mining is about: take raw data and refine value out of it as some processed or finished product or service.

These problems are not what we learn to solve in university as part of the usual CS curriculum. All of our technology and applications are designed around the von Neumann machine.

I think applications and platforms are a chicken-and-egg problem. If you build a platform that does new things before there are applications, it fails because no one needs it. If you already have applications, you build platforms for those old problems, and nothing new happens.

There must be bridging technology. That’s true even for something like Java. It started out as an embedded platform (Oak), became a thin client (sandboxed applets), moved to the server back-end (enterprise IT), and is now back on clients (Android). It was flexible enough to have utility in different markets.

Something like this must happen if GPGPU is to become more significant.

I gave a ten-minute book review at a meetup. It was hard to concentrate in the hospital, so I mostly read Rob Farber’s book a few times. Then I wrote up a slide deck and practiced a few times before the meetup. This went over quite well, which surprised me.

Machine learning is a pervasive theme in the book. It seems Farber also perceives statistical learning theory as the egg that comes before the chickens.

I don’t have a PhD in statistics. I don’t even have a PhD. I dropped out. Twice. And I’ve only had one job involving machine learning, as a support quant. So I have to rely on intuition more than others do.

I wonder if we haven’t seen the good GPGPU ML applications yet. My (very limited) real-world experience is that ML and GPGPU meet just about when diminishing returns have set in. Sensible use of statistical methods, heuristic scoring functions, and map-reduce techniques is sufficient to pick the low-hanging fruit. This could mean one of two things:

  1. Machine learning applications do not need GPGPU.
  2. GPGPU will allow new machine learning applications.

So if we try to use GPGPU for current applications, we do not realize much gain. The space of applications is whatever we conceive as tractable and practical with available technology. When new technology is invented, it may not yield much advantage on old problems.
