Extreme Visualizations – MoMA Exhibition March 17, 2008

Posted by newyorkscot in Science, Visualization / UX.

I went to the Museum of Modern Art (MoMA) yesterday to see their new exhibition, Design and the Elastic Mind. The exhibition has a vast array of really cool concepts and visualizations that are absolutely fascinating. From “Extreme Visualizations” to “Mapping The Internet” to “Nanofracture”, they did a nice job of presenting how modern design and technology impact our lives in many different ways. For example, the Nanofracture section demonstrated advances in nanotechnology in the design of modern structures and materials, as well as in the modeling of the human brain. There was even a section on how origami can be used to model everything from DNA to the Fresnel lenses used in space telescopes. There were also various static and dynamic visualizations of global internet traffic & telephony around the world and in/out of New York (see pictures on New York Talk Exchange), as well as a model of all the flights in the sky across North America over a 24-hour period. All in all, a very creative and insightful exhibit that beautifully marries design, art, engineering and computational power. Most of the exhibits are online at the URL above.

Really Massive Complex Event Processing February 20, 2008

Posted by newyorkscot in Complex Event Processing, HPC, Science.

Scientific American ran a feature this month on the Large Hadron Collider (LHC) being built by CERN to conduct the largest physics experiments ever. Aside from its sheer physical scale, one of the remarkable aspects of the project is the massive volume and frequency of the data generated, making it probably the most impressive combination of complex event processing and distributed grid computing ever:

  • The LHC will accelerate 3000 bunches of 100 billion protons to the highest energies ever generated by a machine, colliding them head-on 30 million times a second, with each collision spewing out thousands of particles at nearly the speed of light.
  • There will be 600 million particle collisions every second, each one called an “event”.
  • The millions of channels of data streaming away from the detector produce about a megabyte of data from each event: a petabyte, or a billion megabytes, of it every two seconds (see the rough arithmetic sketch after this list).
  • More details here and diagram here.
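
As a sanity check on those figures, here is a minimal back-of-envelope sketch in Python, assuming the rates quoted above (roughly 600 million events per second at about a megabyte per event); the constants are taken from the list, not from any CERN specification:

    # Rough check of the LHC data rates quoted above.
    # Assumptions (from the article): ~600 million events/s, ~1 MB per event.
    EVENTS_PER_SECOND = 600e6   # particle collisions ("events") per second
    BYTES_PER_EVENT = 1e6       # ~1 megabyte of detector data per event
    PETABYTE = 1e15             # 1 PB = a billion megabytes

    bytes_per_second = EVENTS_PER_SECOND * BYTES_PER_EVENT
    seconds_per_petabyte = PETABYTE / bytes_per_second

    print(f"Raw detector output: {bytes_per_second / 1e12:.0f} TB/s")
    print(f"Time to produce one petabyte: {seconds_per_petabyte:.1f} s")

Running it gives about 600 TB/s, or one petabyte roughly every 1.7 seconds, which squares with the “every two seconds” figure.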

This massive amount of streaming data needs to be consolidated, filtered and then processed by a tiered grid network. Starting with a few thousand computers/blades at CERN (Tier 0), the data is routed (via dedicated optical cables) to 12 major institutions around the world (Tier 1), and finally on to a number of smaller computing centers at universities and research institutes (Tier 2). Interestingly, the raw data coming off the LHC is saved onto magnetic tape (allegedly the most cost-effective and secure format).
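
To make the tiered fan-out a bit more concrete, here is a hypothetical Python sketch of a filter-and-route pipeline in that spirit; the event fields, energy threshold and site names are illustrative assumptions of mine, not CERN's actual trigger or grid software:

    # Illustrative Tier 0 -> Tier 1 sketch: filter "interesting" events,
    # then spread the survivors across regional centers for Tier 2 analysis.
    # All names and thresholds below are hypothetical.
    import random
    from typing import Dict, Iterator, List

    def detector_events(n: int) -> Iterator[dict]:
        """Generate fake events with a random 'energy' reading."""
        for i in range(n):
            yield {"id": i, "energy": random.expovariate(1.0)}

    def tier0_filter(events: Iterator[dict], threshold: float = 2.0) -> Iterator[dict]:
        """Tier 0: keep only events above an (arbitrary) energy threshold."""
        return (e for e in events if e["energy"] > threshold)

    def tier1_route(events: Iterator[dict], sites: List[str]) -> Dict[str, List[dict]]:
        """Tier 1: distribute surviving events across regional centers."""
        routed: Dict[str, List[dict]] = {site: [] for site in sites}
        for e in events:
            routed[sites[e["id"] % len(sites)]].append(e)
        return routed

    if __name__ == "__main__":
        sites = ["tier1-europe", "tier1-americas", "tier1-asia"]  # made-up names
        routed = tier1_route(tier0_filter(detector_events(100_000)), sites)
        for site, batch in routed.items():
            print(f"{site}: {len(batch)} events forwarded for Tier 2 analysis")

The real system obviously does far more (reconstruction, calibration, tape archival), but the basic shape is the same: drastic filtering near the detector, then routing the surviving events out across the grid.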

I wonder how many nanoseconds they took to consider which CEP vendor they wanted to use for this project?!