Categories: Education

Secrets of the Universe — Coming to an IMAX screen near you in 2016

I’m thrilled to be part of a new NSF-funded IMAX and digital 3D documentary film project that will introduce audiences to the major scientific instruments being used to explore the origins of the universe. Chief among these are the Large Hadron Collider at CERN and a new generation of supercomputers.
My specific contribution is advisory and relates to the role of supercomputing in this scientific enterprise. Systems like Mira accelerate discoveries in the cosmology arena through large-scale scientific simulation and visualization of enormously complex physical phenomena. (Both simulation and visualization will be featured prominently in the film.) Supercomputers were recently used to generate the largest cosmology simulation ever, which will help the scientific community test theories against observational data from sources such as the next generation of sky surveys preparing to come online.
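For readers curious about what such a simulation boils down to computationally, here is a toy sketch of a single time step of a direct-summation gravitational N-body calculation. It is purely illustrative and not the production cosmology code, which evolves trillions of particles with far more sophisticated, massively parallel algorithms; every parameter below is a placeholder of my choosing.

```python
# Toy illustration only: one kick-drift step of a direct-summation
# gravitational N-body model.
import numpy as np

def nbody_step(pos, vel, mass, dt, G=1.0, softening=1e-2):
    """Advance particle positions and velocities by one time step."""
    # Pairwise separation vectors: diff[i, j] = pos[j] - pos[i]
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1) + softening ** 2)
    # Gravitational acceleration on each particle from all others
    acc = G * (mass[np.newaxis, :, np.newaxis] * diff
               / dist[..., np.newaxis] ** 3).sum(axis=1)
    vel = vel + acc * dt   # kick
    pos = pos + vel * dt   # drift
    return pos, vel

# Example: 1,000 random particles advanced by one step
rng = np.random.default_rng(0)
pos = rng.standard_normal((1000, 3))
vel = np.zeros((1000, 3))
pos, vel = nbody_step(pos, vel, np.ones(1000), dt=0.01)
```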
Filming will take place during 2013 and 2014 and will result in a 2D/3D giant screen film, a dome planetarium film, museum exhibits, and other educational materials. It’s a great team of investigators that includes media communications guru Mark Kresser, UC Davis physics professor Manuel Calderon de la Barca, the Franklin Institute’s Dale McCreedy, and IMAX film director Stephen Low.
A particularly interesting aspect of this outreach project is that it also supports a study of middle school girls’ interest and engagement in the topic. (Middle school girls’ interest in science and math tends to plummet around this age due to several social factors.) Films like these are high-quality outreach projects that present complex scientific research to the public in an accessible and entertaining way. It’s also an excellent opportunity to gain insights into how to develop STEM content for an especially vulnerable group of learners.

Categories: Argonne Leadership Computing Facility, Research

Accelerating the discovery of alternative fuel sources

In many ways, biofuel research is like modern-day alchemy. The transmutation of biomass materials — which include anything from kitchen and latrine waste to stalky, non-edible plants — into a sustainable and renewable energy source involves catalysts and chemical reactions. The process promises to help meet the world’s critical energy challenges.
Biofuel research can also be thought of as the ultimate multi-scale, multi-physics research problem. It represents several interesting biological supply-chain management problems. Not surprisingly, biofuel research spans several domains here at Argonne and takes place in wet labs and joint institutes across the lab campus. There is also an exciting INCITE research project underway at the ALCF aimed at finding a more effective way to convert plant materials that contain cellulose, such as wood chips and switchgrass, into sugars, which can then be converted into biofuels.
A science team from the National Renewable Energy Laboratory is using Mira to conduct large-scale simulations of the complex cellulose-to-sugar conversion process. These simulations yield data, such as an enzyme’s binding free energy, that are difficult to obtain through conventional experimental approaches, helping to accelerate the screening and testing of new enzymes. With such information, researchers will be able to identify potential enzyme modifications and then feed their discoveries into experiments aimed at developing and validating improved catalysts. Read the full research highlight here.
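As a rough illustration of the kind of quantity these simulations provide, here is a minimal sketch of estimating a free energy difference from simulation samples using the classic Zwanzig free energy perturbation relation. This is not the NREL team’s method or code, and the samples below are random placeholders rather than data from any real enzyme study.

```python
# A minimal, illustrative estimate of a free energy difference using the
# Zwanzig free energy perturbation relation:
#   dF = -kT * ln < exp(-(U_B - U_A) / kT) >_A
# The "samples" here are random placeholders, not data from any real study.
import numpy as np

kT = 0.593  # kcal/mol at roughly 298 K

def free_energy_difference(dU, kT):
    """Zwanzig estimator; dU holds energy differences U_B - U_A sampled in state A."""
    return -kT * np.log(np.mean(np.exp(-dU / kT)))

rng = np.random.default_rng(1)
dU_samples = rng.normal(loc=1.0, scale=0.5, size=10_000)  # placeholder samples
print(f"Estimated dF ~ {free_energy_difference(dU_samples, kT):.2f} kcal/mol")
```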

Categories: Education

The beginning of a change

After posting a few months back on the exciting STEAM work at RISD, and the push to integrate art into STEM (science, technology, engineering, and mathematics) curricula, I was eager to attend “The Art of Science Learning” talk at Argonne this week, where I learned about a National Science Foundation program with similar goals.
The Art of Science Learning is a national initiative that uses the arts to spark creativity in science education. The goal of the project’s development activities is to experiment with a variety of “innovation incubator” models in cities around the country: one in San Diego (hosted by the Balboa Park Cultural Partnership), one in Chicago (hosted by the Museum of Science and Industry), and one in Worcester, Mass. (hosted by the EcoTarium). These incubators bring professionals from different fields together with the public around STEM education and other STEM-related topics of local interest that can be explored with the help of creative learning methodologies.
Chicago incubator director Tim Morrison spoke about this initiative, and the yearlong effort starting this January to address the STEM challenges of urban nutrition.
Projects like these are aimed at exploring a framework to ultimately change the way children are educated in the U.S. — one that emphasizes creativity and innovation as a means to build a strong economy. I, for one, am extremely encouraged to see this movement gathering steam. Pun intended.

Categories: Uncategorized

Spotlight on leadership-class computing

I’m pleased to announce that Jim Hack and I will be co-editing a special issue of Computing in Science & Engineering magazine on the topic of leadership computing, to be published in fall 2014. The goal of this issue is to explore how leadership computing is being effectively used to support real-world science and engineering applications. The topics of interest and submission guidelines can be found on the CiSE website.

Categories: Musings

Thanks, Reddit Users

I had a great time hosting an “Ask Me Anything” (AMA) session on Reddit last Monday. I spent over two hours answering questions posted by Reddit users, mostly about the ALCF, how supercomputing is used in research, and where I think computing is heading. I didn’t know what to expect when I agreed to host this, but it was a lot of fun! Thanks for all the great questions. I’ll try to get to more of them in the next couple of weeks.

Categories: Argonne Leadership Computing Facility, Education

Extreme-Scale Computing Training Course: Class of 2013!

Today I met with the first class of young scientists and researchers to participate in the Argonne Training Program on Extreme-Scale Computing. The trainees are now into week two of lectures and hands-on sessions aimed at teaching them how to program massively parallel supercomputers. I chaired the afternoon session on data visualization and analysis, complete with a set of success stories, taught by colleagues from ALCF, the University of Oregon/Lawrence Berkeley National Laboratory, and Kitware, Inc.
Training course organizers tell me this is an enthusiastic and motivated group; many of the participants remain long after the lectures end to engage the speakers on topics ranging from the latest performance tools to debugging to data analysis. Last week the group got special access to Argonne’s leadership computing resources, including Mira. This week they got a similar opportunity to experiment with application runs on Oak Ridge National Laboratory’s leadership system, Titan.
By funding this training course, the DOE is helping expand the user community of today’s high-end systems, but, more importantly, it is helping prepare a new generation of computer and computational scientists to keep our national priorities on track.

Categories: Education, Musings

From Lego bricks to C++: promoting computational thinking skills in U.S. schools

TEALS (Technology Education And Literacy in Schools) is a nonprofit tech-literacy teaching program that grew out of Microsoft engineer Kevin Wang’s desire to teach computer science in his spare time. Microsoft liked it, funded it, and made Wang its chief promoter. That was in 2009. For the 2013-2014 academic year, Wang has placed 250 volunteers in 65 schools in 12 states.
The TEALS program recruits, mentors, and places high-tech professionals into high school classrooms to teach computer science courses. These volunteers team-teach with in-service teachers, who eventually assume full responsibility for the coursework. The goal is to give schools and school districts a computer science curriculum they can manage themselves and grow into a sustainable computer science program.
The tech industry has been clamoring for years about the dearth of programming talent in the United States. A program like TEALS, which attempts to fill a gap where there is insufficient government investment and interest in basic computer science education, is a good start. TEALS provides top tech talent to school districts unable to offer a computer science curriculum, but the program’s reach would have to scale up drastically in coming years if the industry hopes to see any impact on U.S. competitiveness in this area.
If an interest in computer science is to take root in a student, basic computational concepts must be introduced much sooner than high school. Computational thinking is integrative. Learning to program is like learning to spell — it’s a skill that makes many other areas of knowledge accessible.
It’s time to develop pedagogical models to enable all U.S. students to graduate high school with a functional understanding of computer science. I would advocate for building up problem solving skills in lower grades, mastering basic programming exercises in middle school, and full immersion in computing languages by high school. The sooner we can get parents, teachers, and administrators to see the value of providing the foundation our kids need to get excited about computer science, the sooner we’ll start effecting real change in our ability to compete.

Categories: Art

Exploring energy themes artistically

This summer Argonne is hosting ART ENERGY FUTURE, a collaborative contemporary art exhibit curated in connection with a 2012 United Nations global awareness campaign about the importance of increasing sustainable access to energy, energy efficiency, and renewable energy. The traveling exhibit debuted last summer at the Turin Museum of Natural History and will be on display at Argonne through September 13.
ART ENERGY FUTURE, curated by Chicago gallery owner and artist Sergio Gomez, features artworks by Italian and American artists, including Argonne computer scientist Mark Hereld. Mark, a member of the ALCF research staff, directs science teams in the analysis and visualization of their computer simulation data, and is well versed in the collaborative exchange between artistic practices and scientific investigations. As an artist, Mark works with various media and collaborators and recently designed the large-scale mural “skin” for ALCF’s new petascale supercomputer that showcases the computational science going on at the lab.
The summer art show is the first in a series of engaging exhibits that Argonne will rotate through the lobby of its main administrative building for the enjoyment of its employees and guests.

Categories: Argonne Leadership Computing Facility

Expanding the community, accelerating mission-critical research

Summertime, specifically July 1, is when the ASCR Leadership Computing Challenge (ALCC) projects get underway at the Leadership Computing Facilities at Argonne and Oak Ridge, and at the National Energy Research Scientific Computing Center (NERSC). The supercomputing centers will support a total of 32 projects and 1.6 billion core-hours — 809 million core-hours to 13 new projects at the ALCF alone.
ALCC projects expand into new areas of science and engineering of interest to the DOE mission — the “high-risk, high-payoff” research aimed at, among other things, national emergency mitigation — and also serve to grow a critical demographic: the community of researchers capable of using leadership computing resources.
This is the first year that ALCC projects will gain access to Argonne’s Mira system, which will greatly accelerate the target research in clean energy, climate change prediction, and batteries. More information about the individual 2013 ALCC awards can be found here.

Categories: Research

Parallel GPGPU application takes aim at tumors

Photo caption: In order to precisely reconstruct images of a tumor, a proton CT scan must pinpoint the exact locations where an individual proton enters and exits the human body. A new generation of particle detectors uses tiny scintillating fibers in combination with ultrasensitive photomultipliers made of silicon to detect a proton's path. (Photo: Reidar Hahn)

Protons, specifically proton beams, are increasingly being used to treat cancer with more precision. To plan for proton treatment, X-ray computed tomography (X-ray CT) is typically used to produce an image of the tumor site — a process that involves bombarding the target with photons, measuring their energy loss and position, and then using projection methods to establish the 3D shape of the target.
A new imaging method, which employs protons instead of photons, promises to deliver more accurate images while subjecting the patient to a lower dose of radiation. Proton computed tomography (pCT) employs billions of protons and multiple computationally intensive processes to reconstruct a 3D image of the tumor site. To achieve the required accuracy would take a long time on a single computer, and it’s not clinically feasible to require a patient to sit still for a long period to be imaged, or to wait a day for the images to be produced.
A group of computer scientists at Argonne and at Northern Illinois University has been working to accelerate pCT imaging using parallel and heterogeneous high performance computing techniques. The team so far has developed the code and tested it on GPU clusters at NIU and at Argonne with astounding results — producing highly accurate 3D reconstructions from proton CT data in less than ten minutes.
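To give a flavor of what such a reconstruction involves, here is a toy sketch of the algebraic idea: each proton history contributes a measured value along its (approximately straight) path through the image, and the image is updated iteratively so that simulated path integrals match the measurements. This is not the team’s GPU code; the geometry, data, and update rule below are illustrative placeholders, and a real pCT reconstruction handles curved proton paths and billions of histories in parallel.

```python
# Toy sketch of an iterative, algebraic (SART-like) update for proton CT.
# All inputs are synthetic placeholders; this is illustrative only.
import numpy as np

def trace_path(entry, exit_, n_steps=64):
    """Sample points along a straight line between entry and exit positions (in [0, 1])."""
    t = np.linspace(0.0, 1.0, n_steps)[:, None]
    return entry[None, :] * (1 - t) + exit_[None, :] * t

def sart_update(image, protons, relaxation=0.1):
    """One pass over a list of proton histories (entry, exit, measured value)."""
    ny, nx = image.shape
    for entry, exit_, measured in protons:
        pts = trace_path(np.asarray(entry), np.asarray(exit_))
        ix = np.clip((pts[:, 0] * nx).astype(int), 0, nx - 1)
        iy = np.clip((pts[:, 1] * ny).astype(int), 0, ny - 1)
        predicted = image[iy, ix].sum() / len(pts)       # current estimate along the path
        correction = relaxation * (measured - predicted) / len(pts)
        np.add.at(image, (iy, ix), correction)           # spread the residual along the path
    return image

# Example: update a 64x64 image with two synthetic proton histories
image = np.zeros((64, 64))
protons = [((0.0, 0.5), (1.0, 0.5), 2.0), ((0.5, 0.0), (0.5, 1.0), 1.5)]
image = sart_update(image, protons)
```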
Image credit: Fermilab