Grid Computing in Lafayette

Zydetech had one of its new-style public seminars last night and it was covered in the Advertiser. They’re worth attending. The seminars naturally have their geeky side. (Hey, what is grid computing?) But the public rarely gets a chance to sit down and listen to industry leaders and local cognoscenti talk about issues that will impact everyone’s future. I always find the conversation after a presentation the most informative part, and Lafayette’s Zydetech is one of the few places anywhere that anyone can walk in and join the real nitty-gritty of the tech conversation. Avail yourself!

(Confession: I had to be at another meeting last night and missed the seminar. Any reader who wants to fill in the juicy details in the comments would be appreciated.)

The Advertiser does a fair job of reporting on the event itself. What is missing is the background that would let a reader know what grid computing is, why it is important, and why it is being discussed in Lafayette just now.

What grid computing is:
Wikipedia offers a technically oriented overview that’s pretty complete if you want the whole nuanced story.

But for the rest of us: Grid computing is a way to make massively effective supercomputers out of the leftovers of everyone’s desktop computers. We look at our computers and see amazingly capable, versatile machines that can do an astonishing range of things. Network managers and uber tech types look at them and think: WASTE. All that processing power going to waste. Most of the time nobody is using any cycles, and even when somebody is, 90% of the cycles still go to waste. For the tidy-minded this is just silly. For those with real, unfilled compute-intensive needs it is offensive.

Not surprisingly, these guys ran out and developed network models that make unused cycles available to those who need them.
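The core pattern is simple enough to sketch in a few lines of Python: a coordinator splits a big job into small work units, and workers pull units off a shared queue whenever they have spare cycles. (This is a toy sketch of the idea, not any particular grid package — threads stand in for networked desktop PCs, and the names are made up for illustration.)

```python
import queue
import threading

def run_grid(work_units, n_workers=4):
    """Distribute work units across workers and combine their partial results."""
    tasks = queue.Queue()
    results = []
    lock = threading.Lock()

    for unit in work_units:
        tasks.put(unit)

    def worker():
        # Each "machine" grabs a unit whenever it is idle, until none remain.
        while True:
            try:
                lo, hi = tasks.get_nowait()
            except queue.Empty:
                return  # no more work: go back to being a desktop PC
            partial = sum(x * x for x in range(lo, hi))  # the actual "science"
            with lock:
                results.append(partial)
            tasks.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(results)

# Split "sum of squares below 1000" into ten units of 100 numbers each.
units = [(i, i + 100) for i in range(0, 1000, 100)]
total = run_grid(units)
```

Real grid middleware adds the hard parts — discovering machines, shipping code and data over the network, and tolerating workers that disappear mid-task — which is exactly why bandwidth ends up being the bottleneck.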

Why Grid Computing is Important:
Well, waste is bad. (Your grandparents told you so and they were right.) But beyond that there really are problems that are too big to run in real time on even the most expensive supercomputer, and big, real computational problems that nobody can afford to purchase time to solve. Those are the sorts of problems grid computing is most likely to be used to solve.

One of the virtues of grid computing is that the resulting supercomputer is super cheap. Cheap is good. Suddenly the little guy can do things that only those who could afford “big iron” could do before. You’ve heard of render farms? Expensive. Exclusive. Anyone with cheap access to a muni-sized grid system could compete cheaply with George Lucas. The most powerful supercomputers are built to model the weather–globally. We could apply the same powerful methods to, say, Lafayette Parish, and begin to get a handle on truly local weather. (Those summer thunderstorms in Louisiana that drench one field and leave another 300 yards away bone dry? We could understand that.) Trouble is, the finer-grained analysis uses almost as much compute power as the continental-level projections. We manage to afford large-scale predictions whose accuracy–say, about hurricanes–would have astonished the best meteorologists of our parents’ day. But we can’t afford to do the same locally. Grid computing could change that. (The list goes on…)

Why is Lafayette talking about this:
The big hangup with large-scale grid computing is bandwidth. There’s never enough of it. People don’t want to give it up, and private providers–who profit by maximizing the difference between the bandwidth you buy and the bandwidth you actually use–don’t want to encourage a new, big drain on their resources that they don’t get paid for. All that is why grid computing is rare, and why most small models run on public networks like those at universities, which view local network usage as something to maximize. (They’ve paid for it and want to use it fully–short of the point of congestion, of course.)

When Lafayette gets its big-bandwidth, 100 meg internal fiber-optic system from LUS, most of us will have bandwidth to burn. It won’t cost us anything noticeable to share our cycles and bandwidth with others. Maybe those who choose to do so could get a small rebate on their bill and/or cheap or free access to the computational power the community has provided itself. The 100 megs would make the grid plenty quick enough for distributed computation. Everyone benefits, and a major, cost-saving, unique, and radically disruptive tool is added to the resources of Lafayette residents and businesses.

Uniquely cheap services are the kind of thing businesses travel to a locale to take advantage of–the food and music here would be a nice plus.

Worth thinking about, no?

7 thoughts on “Grid Computing in Lafayette”

  1. As chairman of Zydetech let me just say this is one of the most concise and easy to understand explanations of Grid computing I’ve seen. What we are trying to do with the new Zydetech is deploy a Master Plan to facilitate all of the IT initiatives underway in Lafayette. Grid computing is a way to stop the brain drain of our talented Computer Science graduates and allow small business access to a supercomputing platform. Large scale application design and research sometimes require computational power that takes days without a supercomputer. A Grid implementation allows “big idea” folks and inventors access to supercomputing to accelerate their time to market. So by giving them access to a Grid implementation we can keep them here and hey maybe create some new jobs and diversify the local economy. That’s the end game. FTTH is what makes the difference because of the powerful bandwidth capability. Without it you can and will experience a bottleneck and much slower performance trying to deploy Grid over copper. Jeff LeBlanc, Proud Chairman of Zydetech, the Technology Committee of the Greater Lafayette Chamber of Commerce. Nous Allons

  2. I agree with Jeff– that was a great summary of grid computing.

    I thought you might be interested to know that grid computing is already being put to use on hurricane simulations in the gulf coast. I can’t go into details, but I know customers who are using my company’s PC-based grid software for hurricane simulation. So that can have some direct benefits for your area!

    Good luck on your efforts to get fiber into homes!

  3. Thanks Jeff, Dan,

    Grid computing is a nice chunk of what will be enabled when we build a community infrastructure. Watching that, and related ways to share resources, play out will be fascinating.

Thanks, Dan, for the heads-up on hurricanes. I know there are people out there working on understanding microcells in hurricanes and other storms. It would be good to be able to understand that better. I’m hoping someone will do some intensive work on building a model that would accurately predict storm surge. LSU is working on it, but it’s pretty granular and keeping up with real-time changes is still an issue.