With the recent Nepal earthquake claiming more than 6,000 lives, many of us have wondered why earthquakes cannot be predicted the way tsunamis or cyclones are. Scientists say that while it is possible to identify the key regions where earthquakes can occur, it is impossible to predict the exact time when they will occur. For example, just three weeks before the quake struck, a team of French and Nepali geologists found that magnitude-8.0 earthquakes hit the region in roughly 700-year cycles. The region had last seen such an earthquake in 1344 – about 670 years ago. So while the geologists knew a quake was highly probable, they could not predict exactly when it would happen.
While past data is useful, predicting an earthquake is immensely difficult, as it means analyzing not only historical records but also precursors that indicate the probability of an earthquake. These include factors such as high temperatures, gas emissions, strange animal behavior and abnormal cloud formations appearing along known fault lines. While geologists and scientists have been studying these patterns, the sheer number of parameters that must be examined to arrive at a conclusion makes predicting earthquakes a near-impossible task. Parameters are typically examined in isolation, as it has not been possible to crunch every parameter together and arrive at an insight.
To improve the chances of predicting earthquakes, all of these diverse parameters have to be brought together and analyzed. Until now, this was a difficult task, as the volume and diversity of the data made it practically impossible for analysts to arrive at any meaningful analysis. However, with the advent of Big Data, there is hope.
Can earthquakes be predicted weeks before they occur?
An innovative company called Terra Seismic believes that it has an answer, and more importantly, believes that earthquakes can be predicted 20-30 days before they occur. Using Big Data and satellite technology, the firm processes large volumes of data collected each day from satellites and ground-based sensors in regions where the probability of an earthquake is high.
This data is then combined with a large number of earthquake precursors: changes in ground water levels, sudden cloud formations, bizarre behavior of animals, birds and fish, changes in ground conductivity, geomagnetic and gravity anomalies, electromagnetic emissions, anomalous atmospheric electric fields, geochemical aberrations such as excessive emissions of radon, hydrogen, helium, carbon dioxide, methane and other gases and fluids, and variations in seismic wave velocities. Algorithms built by the firm then analyze this combined data to judge the probability of an earthquake.
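Terra Seismic has not published its actual algorithms, but the idea of fusing many weak precursor signals into one probability can be sketched in a few lines. The sketch below is purely illustrative: the precursor names mirror those listed above, while the weights and the logistic combination are assumptions of this example, not the firm's method.

```python
# Hypothetical sketch: fusing per-precursor anomaly scores into a single
# probability-like risk value. Weights and the logistic form are assumed
# for illustration only; they are not Terra Seismic's published algorithm.
import math

# Illustrative weights per precursor (assumed values, not real ones)
WEIGHTS = {
    "radon_emission": 0.9,
    "ground_water_level": 0.6,
    "geomagnetic_anomaly": 0.7,
    "animal_behavior": 0.3,
    "seismic_wave_velocity": 0.8,
}

def risk_score(anomalies, bias=-3.0):
    """Map anomaly scores (0 = normal, 1 = highly anomalous) for each
    precursor to a value in (0, 1) via a weighted sum and a logistic."""
    z = bias + sum(WEIGHTS[name] * value for name, value in anomalies.items())
    return 1.0 / (1.0 + math.exp(-z))

quiet = {name: 0.0 for name in WEIGHTS}   # all readings normal
active = {name: 1.0 for name in WEIGHTS}  # all readings anomalous
print(round(risk_score(quiet), 3))   # low score when precursors are quiet
print(round(risk_score(active), 3))  # higher score when all are anomalous
```

In practice such weights would have to be learned from historical data, which is exactly where the volume and diversity of Big Data becomes decisive.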
The company claims to have successfully predicted a number of earthquakes. For example, on 5th April 2013, the firm issued a forecast for Japan; on 12th April 2013, an earthquake hit the identified area and 33 people were injured. On 4th June 2013, the firm made a prediction for North Italy; on 21st June, an earthquake hit the identified area. On 3rd March 2013, the firm issued a forecast for Iran; again, 35 days later, an earthquake hit the identified area.
While the firm has had a number of successes, it is clearly not perfect, as it has not been able to predict every earthquake – including the recent Nepal earthquake. That said, there is hope that the use of Big Data can make the system more accurate. Greater accuracy will help nations prepare more adequately for impending disasters, save lives and lessen the huge economic losses that earthquakes cause.