Science on the TeraGrid
Katz Daniel S. 1,2,3, Callaghan Scott A. 4, Harkness Robert 5, Jha Shantenu 1,6,7, Kurowski Krzysztof 8, Manos Steven 9*, Pamidighantam Sudhakar 10, Pierce Marlon E. 11, Plale Beth 11,12, Song X. Carol 13, Towns John 10
1Computation Institute, University of Chicago and Argonne National Laboratory, USA
2Center for Computation & Technology, Louisiana State University, USA
3Department of Electrical and Computer Engineering, Louisiana State University, USA
4University of Southern California, USA
5San Diego Supercomputer Center, University of California San Diego, USA
6Department of Computer Science, Louisiana State University, USA
7e-Science Institute, University of Edinburgh, UK
8Poznan Supercomputing and Networking Center, Poland
9Information Technology Services, University of Melbourne, Australia
10National Center for Supercomputing Applications, University of Illinois, USA
11Pervasive Technology Institute, Indiana University Bloomington, USA
12School of Informatics and Computing, Indiana University Bloomington, USA
13Rosen Center for Advanced Computing, Purdue University, USA
Received: 21 May 2010; revised: 29 October 2010; published online: 23 November 2010
DOI: 10.12921/cmst.2010.SI.01.81-97
OAI: oai:lib.psnc.pl:689
Abstract:
The TeraGrid is an advanced, integrated, nationally-distributed, open, user-driven, US cyberinfrastructure that enables and supports leading edge scientific discovery and promotes science and technology education. It comprises supercomputing resources, storage systems, visualization resources, data collections, software, and science gateways, integrated by software systems and high bandwidth networks, coordinated through common policies and operations, and supported by technology experts. This paper discusses the TeraGrid itself, examples of the science that is occurring on the TeraGrid today, and applications that are being developed to perform science in the future.
Key words:
computational science applications, grid computing, high performance computing, production grid infrastructure
References:
[1] K.K. Droegemeier, D. Gannon, D. Reed, B. Plale, J. Alameda, T. Baltzer, K. Brewster, R. Clark, B. Domenico, S. Graves, E. Joseph, D. Murray, R. Ramachandran, M. Ramamurthy, L. Ramakrishnan, J.A. Rushing, D. Weber, R. Wilhelmson, A. Wilson, M. Xue, S. Yalda, Service-oriented environments for dynamically interacting with mesoscale weather. Computing in Science and Engineering 7 (6), 12-29 (2005).
[2] B. Plale, D. Gannon, Y. Huang, G. Kandaswamy, S. Lee Pallickara, A. Slominski, Cooperating services for data-driven computational experimentation. Computing in Science and Engineering 7, 34-43 (2005).
[3] S. Shirasuna, A Dynamic Scientific Workflow System for the Web Services Architecture. PhD thesis, Indiana University, September 2007.
[4] S. Jensen, B. Plale, Schema-independent and schema-friendly scientific metadata management. In ESCIENCE ‘08: Proceedings of the 2008 Fourth IEEE International Conference on e-Science, 428-429, Washington, DC, USA, IEEE Computer Society (2008).
[5] G. Kandaswamy, L. Fang, Y. Huang, S. Shirasuna, S. Marru, D. Gannon, Building web services for scientific grid applications. IBM J. Res. Dev. 50 (2/3) 249-260 (2006).
[6] Ch. Herath, B. Plale, StreamFlow – programming model for data streaming in scientific workflows. In Proceedings of the 10th IEEE/ACM Int’l Symposium on Cluster, Cloud, and Grid Computing (CCGrid 2010) 2010.
[7] S. Callaghan, E. Deelman, D. Gunter, G. Juve, P. Maechling, Ch. Brooks, K. Vahi, K. Milner, R. Graves, E. Field, D. Okaya, T. Jordan, Scaling up workflow-based applications. J. Comput. System Sci., 2010 (Special issue on scientific workflows, in press).
[8] R. Graves, T. Jordan, S. Callaghan, E. Deelman, E. Field, G. Juve, C. Kesselman, P. Maechling, G. Mehta, K. Milner, D. Okaya, P. Small, K. Vahi, CyberShake: A physics-based seismic hazard model for southern California. Pure and Applied Geophysics, 2010 (accepted for publication).
[9] R. Dooley, K. Milfeld, Ch. Guiang, S. Pamidighantam, G. Allen, From proposal to production: Lessons learned developing the computational chemistry grid cyberinfrastructure. Journal of Grid Computing, 4 (2), 195-208 (2006).
[10] N. Wilkins-Diehr, D. Gannon, G. Klimeck, S. Oster, S. Pamidighantam, TeraGrid science gateways and their impact on science. Computer 41 (11), 32-41 (2008).
[11] S.K. Sadiq, M.D. Mazzeo, S.J. Zasada, S. Manos, I. Stoica, C.V. Gale, S.J. Watson, P. Kellam, S. Brew, P.V. Coveney, Patient-specific simulation as a basis for clinical decision-making. Philosophical Transactions of the Royal Society A, 366 (1878), 3199-3219 (2008).
[12] M.D. Mazzeo, P.V. Coveney, HemeLB: A high performance parallel lattice-Boltzmann code for large scale fluid flow in complex geometries. Computer Physics Communications 178 (12), 894-914 (2008).
[13] R.S. Saksena, B. Boghosian, L. Fazendeiro, O.A. Kenway, S. Manos, M.D. Mazzeo, S.K. Sadiq, J.L. Suter, D. Wright, P.V. Coveney, Real science at the petascale. Philosophical Transactions of the Royal Society A 367 (1897), 2557-2571 (2009).
[14] M.D. Mazzeo, S. Manos, P.V. Coveney, In situ ray tracing and computational steering for interactive blood flow simulation. Computer Physics Communications 181, 355-370 (2010).
[15] N. Karonis, B. Toonen, I. Foster, A grid-enabled implementation of the message passing interface. Journal of Parallel and Distributed Computing (JPDC) 63 (5), 551-563 (2003).
[16] S. Manos, S. Zasada, P.V. Coveney, Life or Death Decision-making: The Medical Case for Large-scale, On-demand Grid Computing. CTWatch Quarterly Journal 4 (2), 35-45 (2008).
[17] K. Yoshimoto, P. Kovatch, P. Andrews, Co-scheduling with user-settable reservations. In D.G. Feitelson, E. Frachtenberg, L. Rudolph, U. Schwiegelshohn, editors, Job Scheduling Strategies for Parallel Processing 146-156, Springer Verlag, Lect. Notes Comput. Sci. 3834 (2005).
[18] J. MacLaren, M. McKeown, S. Pickles, Co-Allocation, Fault Tolerance and Grid Computing. In Proceedings of the UK e-Science All Hands Meeting 2006, 155-162 (2006).
[19] P.V. Coveney, R.S. Saksena, S.J. Zasada, M. McKeown, S. Pickles, The application hosting environment: Lightweight middleware for grid-based computational science. Computer Physics Communications 176 (6), 406-418 (2007).
[20] S. Gogineni, D. Braaten, Ch. Allen, J. Paden, T. Akins, P. Kanagaratnam, K. Jezek, G. Prescott, G. Jayaraman, V. Ramasami, C. Lewis, D. Dunson, Polar radar for ice sheet measurements (PRISM). Remote Sensing of Environment 111 (2-3), 204-211 (2007) (Remote Sensing of the Cryosphere Special Issue).
[21] Z. Guo, R. Singh, M. Pierce, Building the PolarGrid portal using web 2.0 and OpenSocial. In GCE ’09: Proceedings of the 5th Grid Computing Environments Workshop 1-8, New York, NY, USA, ACM (2009).
[22] J. Alameda, M. Christie, G. Fox, J. Futrelle, D. Gannon, M. Hategan, G. Kandaswamy, G. von Laszewski, M.A. Nacar, M. Pierce, E. Roberts, Ch. Severance, M. Thomas, The open grid computing environments collaboration: portlets and services for science gateways: Research articles. Concurr. Comput.: Pract. Exper. 19 (6), 921-942 (2007).
[23] R. Kalyanam, L. Zhao, T. Park, L. Biehl, C.X. Song, Enabling user-oriented data access in a satellite data portal. In Proceedings of the 3rd International Workshop on Grid Computing Environment (2007).
[24] T. Goodale et al., A Simple API for Grid Applications (SAGA). http://www.ogf.org/documents/GFD.90.pdf.
[25] S. Jha, H. Kaiser, Y. El Khamra, O. Weidner, Design and implementation of network performance aware applications using SAGA and Cactus. In Proceedings of the IEEE International Conference on e-Science and Grid Computing, 143-150 (2007).
[26] S. Jha, Y. El Khamra, H. Kaiser, O. Weidner, A. Merzky, Developing adaptive scientific applications with hard to predict runtime resource requirements. In Proceedings of TeraGrid 2008 Conference (2008).
[27] A. Luckow, S. Jha, A. Merzky, B. Schnor, J. Kim, Reliable Replica Exchange Molecular Dynamics Simulation in the Grid using SAGA CPR and Migol. In Proceedings of UK e-Science 2008 All Hands Meeting, Edinburgh, UK (2008).
[28] Y. El-Khamra, S. Jha, Developing autonomic distributed scientific applications: a case study from history matching using ensemble kalman-filters. In Proceedings of the 6th International Conference on Autonomic Computing (ICAC); Industry session on Grids meets Autonomic Computing 19-28, New York, NY, USA (2009). ACM.
[29] A. Binczewski, N. Meyer, J. Nabrzyski, S. Starzak, M. Stroiński, J. Węglarz, First experiences with the Polish optical Internet. Comput. Netw. 37 (6), 747-759 (2001).
[30] M. Kosiedowski, K. Kurowski, C. Mazurek, J. Nabrzyski, J. Pukacki, Workflow applications in GridLab and PROGRESS projects: Research articles. Concurr. Comput.: Pract. Exper. 18 (10), 1141-1154 (2006).
[31] K. Kurowski, W. Back, W. Dubitzky, L. Gulyás, G. Kampis, M. Mamonski, G. Szemes, M. Swain, Complex system simulations with QosCosGrid. In ICCS ‘09: Proceedings of the 9th International Conference on Computational Science, Berlin, Heidelberg, Springer-Verlag, 387-396 (2009).