Explore, Learn & Visualize Model Systems Safely in the Confines of a Computer - Computer Simulations

Ravi Ravishanker, CIO & Associate Dean for WellesleyX, Wellesley College

The exact definition of a computer simulation is beyond the scope of this article. In general, computer simulations are computer programs that create scenarios based on a mathematical or statistical model. The advantage of a computer simulation is that it can generate millions of possible states of the model being simulated or, in some cases, simulate the evolution of the system being studied over time. Computer simulations require enormous computing resources, and thanks to the availability of faster computers and the ease of parallelization, they are used in many different fields, from the natural, social, and behavioral sciences to the military, which uses them to simulate wars and study the outcomes.

The earliest computer simulations were in meteorology and nuclear physics following World War II. Stanislaw Ulam and John von Neumann developed the Monte Carlo method to study nuclear fission using a well-known theoretical model in physics and chemistry called the “hard sphere” model. Though this work helped build the nuclear bomb, the Monte Carlo method has been developed further over the years and is used in many areas today. Monte Carlo simulations are based on generating large quantities of random numbers, and the method was so named because Ulam’s uncle used to gamble in Monte Carlo with money borrowed from friends and relatives. However, as a grad student doing Monte Carlo simulations, I was told that the roulette wheels in Monte Carlo generated the best random numbers, and hence the name!

One of the earliest computer simulations was the estimation of pi, the mathematical constant. This is known as Buffon’s needle problem: if a needle is dropped onto a floor made of strips of wood, all of the same width, what is the probability that the needle will cross a line between strips? Buffon solved this theoretically, and a computer simulation can be performed by dropping needles at random and comparing the number of times a needle crosses one of the parallel lines with the total number of drops. This is an excellent and simple example from which you can learn a lot by changing the gap between the parallel lines, the length of the needle, and the number of trials; a small sketch of the idea follows.
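
As a concrete illustration, here is a minimal Python sketch of Buffon's needle used as a Monte Carlo estimate of pi. The choice of a needle length equal to the strip width (both set to 1) and the function name estimate_pi are assumptions of this sketch, not details from the article; with that choice the crossing probability is 2/pi, so pi is roughly twice the number of drops divided by the number of crossings.

import math
import random

def estimate_pi(trials: int, seed: int = 0) -> float:
    # Needle length and strip width are both 1, so P(cross) = 2/pi.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = rng.uniform(0.0, 0.5)              # distance from needle centre to nearest line
        theta = rng.uniform(0.0, math.pi / 2)  # acute angle between needle and the lines
        if x <= 0.5 * math.sin(theta):         # the half-needle reaches the line
            hits += 1
    return 2.0 * trials / hits if hits else float("inf")

print(estimate_pi(1_000_000))  # prints a value near 3.14; more trials tighten it

Increasing the number of trials, or changing the needle length relative to the strip width, shows directly how the accuracy of the estimate responds, which is the kind of experimentation suggested above.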

Computer simulations have been used to study molecules ranging from a few atoms in the early days to very large and complex systems today.

Almost all computer simulations are disk hogs, and nearly all of the results need to be saved somewhere for future analyses

Scientists have developed sophisticated models to describe atomic and molecular interactions and motions, based on which simulations can be carried out and the results visualized. Molecular dynamics offers a powerful means of studying the motions of complex molecules such as DNA and proteins, which helps us understand several biological processes as well as design drugs. Computer simulations have to start somewhere, and these initial conditions strongly influence the results. Similarly, the quality of the random numbers generated also affects the results. So a lot of care must be exercised to make sure that the results of a simulation can be trusted; the sketch below illustrates the point about initial conditions.
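
To make the initial-condition point concrete, here is a minimal Python sketch of the velocity-Verlet time stepping commonly used in molecular dynamics, reduced, purely as an assumption for illustration, to a single particle on a harmonic spring. The function simulate and its parameters are inventions of this sketch; shifting the starting position by one percent shifts every point of the resulting trajectory.

def simulate(x0: float, v0: float, k: float = 1.0, m: float = 1.0,
             dt: float = 0.01, steps: int = 1000) -> list[float]:
    # Velocity-Verlet integration of one particle on a harmonic spring.
    x, v = x0, v0
    a = -k * x / m                       # acceleration from the spring force
    trajectory = [x]
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt  # advance the position
        a_new = -k * x / m               # force at the new position
        v += 0.5 * (a + a_new) * dt      # advance the velocity with the averaged force
        a = a_new
        trajectory.append(x)
    return trajectory

run_a = simulate(x0=1.00, v0=0.0)
run_b = simulate(x0=1.01, v0=0.0)        # a one-percent change in the starting point
print(max(abs(p - q) for p, q in zip(run_a, run_b)))

In a real many-body simulation, small differences in the starting configuration grow far faster than they do for this single spring, which is exactly why the initial conditions deserve so much care.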

Virtual labs for genetics provide excellent ways to understand genetics at your own pace and without setting foot in a lab. I distinctly remember that when I took the course “Introduction to Biology - The Secret of Life” on edX, we used several online resources to understand the course materials. During a social gathering, one of my friends told me that I looked a little tired and asked whether I was coming from a golf outing. I said, “No, I am tired from making fruit flies mate and studying the genetic traits of their offspring.” He thought I had had a bit too much to drink, but in fact that was exactly what I had been doing as part of my course, using a virtual genetics lab that relies on computer simulation to generate thousands of samples with specified genetic traits, cross them over a certain number of generations, and provide an aggregate picture at the end. All happening in seconds!
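
For a sense of what such a virtual lab is doing under the hood, here is a toy Python sketch of random Mendelian crosses over several generations. The single-gene trait, the allele labels A and a, and the helper names cross and breed are assumptions of this sketch, not the actual edX lab; starting from equal numbers of pure-breeding parents, random mating settles the dominant and recessive phenotypes near the familiar 3:1 ratio.

import random
from collections import Counter

def cross(parent1: str, parent2: str, rng: random.Random) -> str:
    # Each parent passes one randomly chosen allele to the offspring.
    return "".join(sorted(rng.choice(parent1) + rng.choice(parent2)))

def breed(founders: list[str], generations: int, size: int, seed: int = 0) -> Counter:
    rng = random.Random(seed)
    population = founders
    for _ in range(generations):
        population = [cross(rng.choice(population), rng.choice(population), rng)
                      for _ in range(size)]
    # The dominant allele 'A' masks the recessive 'a' in the phenotype.
    return Counter("dominant" if "A" in genotype else "recessive"
                   for genotype in population)

# Start from pure-breeding AA and aa flies and follow thousands of offspring.
print(breed(["AA", "aa"], generations=3, size=10_000))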

Computer simulations are the backbone of weather prediction. The models that describe the weather have grown sophisticated over time, and they are based on actual measurements of many weather-related variables from various geographical locations. Simulations are then carried out using these models and certain initial conditions to predict the future. Since many of these models are statistical in nature, there is a confidence level associated with the estimated properties. In many cases, literally thousands of simulations with very different initial conditions are carried out and combined to produce the final prediction. Most recently, these methods have been used to predict election results and sports outcomes. The use of computer simulations today is pervasive, and it is hard to find a discipline where they are not used; they are even being applied to the study of radicalization and terrorism.
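
The ensemble idea, many runs from slightly perturbed starting points combined into one forecast with an associated confidence, can be sketched in a few lines of Python. The logistic map stands in for a weather model here purely as an assumption for illustration, as does the function name run_model.

import random
import statistics

def run_model(x0: float, steps: int = 50, r: float = 3.9) -> float:
    # A chaotic toy model: each step applies the logistic map.
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

rng = random.Random(42)
# One thousand runs, each starting from a slightly perturbed initial condition.
ensemble = [run_model(0.5 + rng.gauss(0.0, 1e-3)) for _ in range(1000)]
print("forecast mean  :", statistics.mean(ensemble))
print("forecast spread:", statistics.stdev(ensemble))

The ensemble mean plays the role of the headline prediction, and the spread is what gets reported as the confidence in it.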

Computer simulations require enormous computing resources. Researchers tend to become more ambitious as more resources become available, and before you know it they will be asking for even more. Almost all computer simulations are disk hogs in that literally millions of snapshots of the system being studied are generated, and all of them need to be saved somewhere for future analyses. Many simulations are parallelizable, but some are not. For example, if you are studying the molecular dynamics of a system, you need to wait for one step to be computed completely before moving on to the next. At each step you need to compute millions of distances, which takes a long time. What scientists have done is parallelize the calculation of distances within each step, collect the results together, and then move to the next step; a rough sketch of that pattern follows. It is not enough to provide the infrastructure; it must also be capable of running massively parallel jobs. Visualizing the results is another important component of computer simulations, so we also need to provide the computers, graphics software, and hardware that make it possible. Virtual reality will make this even more demanding on IT support. Imagine the ability to dive right into the middle of a complex protein and watch it jiggle in real time!
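
Here is a rough Python sketch of that within-a-step parallel pattern: the pairwise distance work is split across processes, the partial results are gathered, and only then does the loop advance to the next step. The chunk size, the particle count, and the helper names distances and chunks are assumptions of this sketch rather than anything from a production molecular dynamics code.

import math
import random
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def distances(pairs, positions):
    # Worker: compute the distances for one chunk of atom pairs.
    return [math.dist(positions[i], positions[j]) for i, j in pairs]

def chunks(seq, n):
    # Split the full pair list into chunks of at most n pairs.
    for k in range(0, len(seq), n):
        yield seq[k:k + n]

if __name__ == "__main__":
    rng = random.Random(0)
    positions = [(rng.random(), rng.random(), rng.random()) for _ in range(500)]
    pairs = list(combinations(range(len(positions)), 2))
    n_chunks = math.ceil(len(pairs) / 20_000)

    with ProcessPoolExecutor() as pool:
        for step in range(3):  # each step waits for all distances from the previous one
            partial = pool.map(distances, chunks(pairs, 20_000),
                               [positions] * n_chunks)
            all_distances = [d for chunk in partial for d in chunk]
            # ...use all_distances to compute forces and update positions...
            print(f"step {step}: computed {len(all_distances)} distances")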
