The research examined the influence of various pollution sources on mortality caused by short-term increases in ozone and nitrogen oxides. Our group carried out the European part of the study.
In the field of environmental science, we examined the valuation of air pollution sources in terms of their contribution to increased mortality. The contribution of different pollution sources to increased mortality in a given area varies greatly with location, time pattern, and the chemical composition of the emitted substances. Understanding and quantifying these differences is essential for designing effective regulatory measures. Existing modelling methods mostly evaluated the contribution of an individual source, or a group of sources, to a change in a single parameter. Our study assessed the effects of various pollution sources on mortality caused by short-term increases in ozone and nitrogen oxides for three regions, namely Europe, the USA, and Canada; our group carried out the European part of the study. The results show high spatial and temporal variability in the importance of individual sources, ranging from benefits of reduced ozone-related mortality, quantifiable in Europe at almost 320 million Euro per day for a 10% reduction of nitrogen oxides in Barcelona, down to zero or even negative contributions of some sources in areas of England and the Netherlands. The work was presented at the ITM 2012 conference in the Netherlands and published in the proceedings "Air Pollution Modeling and its Application XXII" (Springer, 2013) in the article Pappin, A., Hakami, A., Resler, J., Liczki, J., Vlček, O.: Source attribution of air pollution abatement health benefits.
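To give a flavour of the valuation arithmetic involved, the following minimal Python sketch monetizes the mortality benefit of a pollutant decrease using a standard log-linear concentration-response function and a value of statistical life (VSL). It is purely illustrative; the function name and all numbers are placeholders, not values or methods from the paper.

```python
# Toy sketch (not the study's actual model): monetize the mortality
# benefit of a short-term ozone decrease via a log-linear
# concentration-response function and a value of statistical life.
import math

def daily_mortality_benefit(delta_conc_ppb, baseline_deaths_per_day,
                            beta_per_ppb, vsl_eur):
    """Avoided deaths per day times VSL for a concentration decrease."""
    # Fraction of baseline mortality avoided for the decrease delta_conc_ppb
    fraction_avoided = 1.0 - math.exp(-beta_per_ppb * delta_conc_ppb)
    avoided_deaths = baseline_deaths_per_day * fraction_avoided
    return avoided_deaths * vsl_eur

# Hypothetical inputs: a 1.5 ppb ozone drop from an emission cut,
# 100 daily deaths in the affected population, beta = 0.0005 per ppb,
# VSL = 2 million EUR.
print(daily_mortality_benefit(1.5, 100.0, 5e-4, 2e6))
```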
Krylov subspace methods represent one of the most important classes of methods for solving systems of linear equations. The authors' idea of writing a book together dates back to 1997; work on the book itself took ten years.
In 2012 Oxford University Press published the 408-page monograph of Jörg Liesen (TU Berlin) and Zdeněk Strakoš, Krylov Subspace Methods: Principles and Analysis (ISBN 978-0-19-965541-0). The authors' idea of writing a book together dates back to 1997; work on the book itself took ten years. Krylov subspace methods represent one of the most important classes of methods for solving systems of linear equations (they were ranked among the ten most important algorithmic ideas of the 20th century). They are described in several excellent books by first-class authors, which reflect above all the results of the enormous algorithmic development spanning several decades. The present monograph is oriented differently. Its aim is to describe the mathematical foundations of Krylov subspace methods. This necessarily places into context a number of mathematical disciplines, as well as a historical development that goes back several centuries and is still connected with the latest results of computational mathematics and computational methods in science. A substantial part of the covered material is treated in book form for the first time, and some results are new. The emphasis on interpretation, rather than on algorithmic descriptions, distinguishes the monograph from the existing literature.
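To give a concrete flavour of what a Krylov subspace method is, here is a minimal textbook-style sketch (illustrative, not taken from the monograph) of the conjugate gradient method, the classic Krylov subspace method for symmetric positive definite systems:

```python
# Conjugate gradient method for a symmetric positive definite system
# Ax = b. With x0 = 0, the k-th iterate minimizes the A-norm of the
# error over the Krylov subspace span{b, Ab, ..., A^{k-1} b}.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    n = b.size
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b.copy()          # residual r = b - A x
    p = r.copy()          # search direction
    rr = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

# Small SPD example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approx. [0.0909, 0.6364]
```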
In the field of medical informatics and biostatistics, different sets of genes were identified that, with the help of molecular genetic testing, single out individuals at increased genetic risk of heart attack.
In the field of medical informatics and biostatistics, cardiovascular genetic studies identified different sets of genes that, with the help of molecular genetic testing, single out individuals at increased genetic risk of heart attack. Some of these sets target the increased risk of death within six months of the attack. The methods for finding these gene sets subsequently led to the award of three patents. Below is a brief description of one of them, patent No. 303458 by J. Zvárová, I. Mazura, Z. Valenta, P. Feglarová and H. Grünfeldová, entitled "A method of identifying persons at increased genetic risk of death after myocardial infarction", published on September 19, 2012. A specifically designed genome-wide study of gene expression in patients with acute myocardial infarction and in paired controls with similar levels of clinical risk factors for acute myocardial infarction led to the identification of a relatively narrow set of genes and transcripts associated with an increased risk of the patient's death within a relatively short period after the cardiac event. In practice, extreme expression values of the selected genes in patients who develop acute myocardial infarction may serve as an indicator of the need for intensified medical care and supervision, especially in the short period immediately after the patient's hospitalization for acute myocardial infarction.
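As a rough illustration of how such a candidate gene set might be selected (a generic sketch only, not the patented method), one can compare expression between patients who died within the follow-up window and those who survived, controlling the false discovery rate. All data and thresholds below are synthetic placeholders:

```python
# Illustrative sketch: per-gene two-sample t-tests between deceased and
# surviving patients, with Benjamini-Hochberg FDR control. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_dead, n_alive = 1000, 20, 80
expr_dead = rng.normal(0.0, 1.0, (n_genes, n_dead))
expr_alive = rng.normal(0.0, 1.0, (n_genes, n_alive))
expr_dead[:5] += 1.5          # plant 5 "risk" genes with shifted expression

# Per-gene two-sample t-test across samples
_, pvals = stats.ttest_ind(expr_dead, expr_alive, axis=1)

# Benjamini-Hochberg: keep the k smallest p-values, where k is the
# largest index with p_(k) <= (k/m) * q
q = 0.05
order = np.argsort(pvals)
ranked = pvals[order]
m = n_genes
passed = ranked <= (np.arange(1, m + 1) / m) * q
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
selected = order[:k]
print("candidate risk genes:", sorted(selected.tolist()))
```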
A positive answer would provide a memory-efficient and error-free solution for hundreds of practical problems. The problem we solved had been cited at the most prestigious conferences as the threshold at which all known techniques fail.
In the field of theoretical computer science, a series of three articles by J. Šíma and S. Žák proved a deep result (the mathematical proof takes about 40 pages) that contributes to solving one of the central open problems of computational complexity theory. One of these articles was published in 2011 (LNCS 6651, Berlin: Springer-Verlag, pp. 120-133), the second in 2012 (LNCS 7147, Berlin: Springer-Verlag, pp. 406-418), and the third has been submitted for publication. The problem is connected with the question of whether randomness can make computations with limited memory more efficient.

Let us illustrate the problem with an example from everyday life. Imagine that we need to drive, without a map, through an unfamiliar big city from one location (the start) to another (the target); today this task is handled by navigation systems. The situation is complicated by one-way streets, various overpasses, underpasses, tunnels and closures. One option is to set out from the start and systematically try all possible routes until we reach the target: whenever we arrive somewhere we have already been, or end up in a dead-end street, we return to the first intersection with a street we have not yet tried. However, this procedure requires us to remember where we have already been, which may be a problem in a big city with hundreds of streets. Another possibility is a random drive through the city: at each intersection we toss a coin to decide whether to turn left or right, or roll a die if the intersection is more complicated and offers more ways to continue. Interestingly, it can be proved mathematically that with high probability we reach the target in a reasonable time, and in this case we need not remember the route already travelled. The main drawback of this approach, on the other hand, is that we may not reach the target at all (within a given time limit), although we know that such a failure of the random drive is unlikely. The question is whether there is a way to get from the start to the target with certainty, without tossing a coin, while remembering only a few landmarks.

This traffic problem captures all the essential aspects of computing with limited memory, so it can serve to study the general question of whether such computations can be rid of the (unlikely) possibility of error. A positive answer would provide a memory-efficient and error-free solution for hundreds of important practical problems. The fundamental importance of the question for information technology is illustrated by the fact that the solution of its simpler version (without one-way streets) was among the results awarded the 2009 Gödel Prize (an equivalent of the Nobel Prize in theoretical computer science). Since the problem is very difficult for universal computational models (e.g., Turing machines), restricted computational models are studied. We were able to solve it for 1-branching programs of width 3, which during the computation can de facto occupy only 3 states (solving the original problem would require proving the corresponding theorem for unrestricted width). Although this computational model is very weak, our problem had been referred to at the most prestigious conferences in theoretical computer science as the limit at which all known techniques fail. The forthcoming monograph S. Vadhan: Pseudorandomness, which cites our result, shifts this limit to width 4, for which the problem remains open.
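The memoryless "random drive" strategy from the analogy is easy to simulate. The sketch below (a toy illustration with a made-up street network, not code from the articles) performs a random walk on an undirected graph, remembering nothing about the route; on a connected undirected graph such a walk reaches the target in expected polynomial time, which is the idea behind the randomized low-memory algorithm:

```python
# Memoryless random walk: at each intersection choose a random neighbour,
# remembering nothing about the route already travelled.
import random

def random_walk(graph, start, target, max_steps=100_000):
    """Return the number of steps taken to reach target, or None on timeout."""
    node = start
    for step in range(1, max_steps + 1):
        node = random.choice(graph[node])   # coin toss / die roll
        if node == target:
            return step
    return None

# Toy street network as an adjacency list (undirected)
city = {
    0: [1, 2], 1: [0, 3], 2: [0, 3],
    3: [1, 2, 4], 4: [3],
}
steps = random_walk(city, start=0, target=4)
print("reached target in", steps, "steps")
```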
The authors' intention was to write a textbook for students in technical fields that would bring them into the world of matrix computations and numerical mathematics.
The intention of the authors J. Duintjer Tebbens, I. Hnětynková, M. Plešinger, Z. Strakoš and P. Tichý was to write a textbook for students in technical fields that would bring them into the world of matrix computations and numerical mathematics. Based on the authors' lectures at various universities, the monograph entitled Analysis of Methods for Matrix Computations was published by Matfyzpress (ISBN 978-80-7378-201-6, 328 pages). The style of exposition balances formality and clarity. The suitability of this style follows from the authors' position on the boundary between classical and applied mathematics, which hints at the complexity of this world. For phenomena of this nature, a verbal description often allows a much deeper understanding than an exact formalistic treatment would, in which the meaning of notions is often inadmissibly narrowed. The authors start from the Schur theorem, which forms the basis of many further paths, and gradually cover orthogonal transformations, the basic matrix decompositions (QR, LU, SVD), the solution of least squares problems, the partial eigenvalue problem, and the solution of systems of linear algebraic equations by iterative methods. At the same time, they put strong emphasis on the analysis and understanding of algorithms and on the computational side of things, including the behaviour of the considered algorithms in finite precision arithmetic.
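As a small illustration of one topic the textbook covers (the example itself is not taken from the book), the QR decomposition reduces a least squares problem min ||b - Ax|| to a triangular system R x = Q^T b:

```python
# Solving an overdetermined least squares problem via the QR decomposition.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))                   # overdetermined system
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 1e-3 * rng.standard_normal(6)    # noisy right-hand side

Q, R = np.linalg.qr(A)                 # thin QR: Q is 6x3, R is 3x3
x = np.linalg.solve(R, Q.T @ b)        # solve the triangular system

print("recovered x:", x)               # close to x_true
```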