## MEMBERS

### Program Members

Drastic improvements in processing speed and in the storage capacity of computers have made it possible to collect and analyze large amounts of data and to construct more realistic models that include uncertainty. In addition, large-scale simulations can now be performed with a high degree of accuracy. Computational statistics has played a major role in science, as this technology is important not only in traditional fields of science and engineering but also in the social sciences, such as finance, economics, and government policy making.

Statistics can roughly be divided into two major areas: modeling and computation (simulation). Probabilistic modeling was first incorporated into statistics in England over a hundred years ago. However, the probabilistic concept was not fully introduced into computation until much later, about sixty years ago during World War II. This took place not in statistics but in physics, as part of the development of the atomic bomb in the United States. At that time, computations based on the probabilistic concept came to be called the Monte Carlo method.
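To make the idea concrete, here is a minimal sketch of the Monte Carlo method (an illustrative textbook example, not the wartime computations themselves): estimating π by sampling random points in the unit square and counting how many fall inside the quarter circle.

```python
import random

def monte_carlo_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by uniform sampling in the unit square.

    The fraction of points with x^2 + y^2 <= 1 approximates
    the area of the quarter circle, pi / 4.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

The error of such an estimate shrinks like 1/√n, which is exactly why the method only became practical once fast computers were available.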

Around the same time, computers were born, and they gradually became commonplace. As computers improved, they were naturally put to use in statistics. In particular, processing power and storage capacity increased drastically in the 1990s, while the price of computers simultaneously decreased. These changes have led to a modern society in which computers are taken for granted; nowadays, it would not be an exaggeration to say that everyone has one. Especially since the advent of the Internet in the 1990s, computers have enabled instantaneous access to a wealth of information from around the globe. In addition, it has become possible to store large amounts of data on personal computers.

Against this background, expectations for statistics have risen in a vast range of fields as a technology for quickly processing large amounts of data. In medical science and genetics, which have traditionally had a close relationship with statistics, the Human Genome Project was completed in 2003, leaving a huge database of base sequences that can now be easily accessed via the Internet. Moreover, a variety of data on the global environment is continuously transmitted from satellites, and statistics is used in research on weather forecasting and global warming. In the business world, point-of-sale (POS) data from convenience stores is gathered from all over the country and analyzed to predict sales and demand. In the financial world, globalization has advanced drastically during the last decade; diverse and complex financial products are created on computers and traded throughout the world, moving huge sums of money.

Hence, statistics is used as a tool to analyze many different types of data. As an example, consider the expected value, which is the very basis of statistical computation. In the past, conditions such as normality and linearity were imposed on models, so in many cases an expected value could be computed analytically or with a simple calculator. Looked at from the other side, one could say that models were chosen precisely so that statistical quantities would be light to compute. However, the high speed and large storage capacity of modern computers have enabled modeling with a large degree of freedom. A wide variety of models that presuppose large-scale numerical computation, including non-Gaussian, non-linear, and non-stationary models, can now be handled.
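The contrast above can be illustrated with a small hypothetical example. For a non-linear function of a non-Gaussian variable, say the payoff E[max(X − K, 0)] where X is log-normal (a quantity of the kind that arises with the financial products mentioned earlier), one typically turns to simulation rather than a hand calculation; the sketch below estimates it by averaging over random draws. The parameter values are illustrative, not from the source.

```python
import math
import random

def mc_expected_payoff(mu: float, sigma: float, strike: float,
                       n_samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of E[max(X - K, 0)] where
    X = exp(mu + sigma * Z) and Z is standard normal,
    so X is log-normal (a non-Gaussian distribution).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(n_samples):
        z = rng.gauss(0.0, 1.0)          # draw Z ~ N(0, 1)
        x = math.exp(mu + sigma * z)     # transform to log-normal X
        total += max(x - strike, 0.0)    # non-linear payoff
    return total / n_samples             # sample mean approximates E[...]

estimate = mc_expected_payoff(mu=0.0, sigma=0.2, strike=1.0,
                              n_samples=200_000)
```

For this particular log-normal case a closed form happens to exist (a Black–Scholes-type formula), which makes it a convenient check; for more general non-linear, non-stationary models no such formula is available, and simulation of this kind is the standard route.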

As described above, computational statistics is rapidly expanding its fields of application. At the Graduate School of Mathematics, we not only conduct theoretical research in computational statistics but also pioneer new fields of application and provide education and training for young researchers.