  • malariaAtlas: an R interface to global malariometric data hosted by the Malaria Atlas Project.

    11 October 2018

    BACKGROUND: The Malaria Atlas Project (MAP) has worked to assemble and maintain a global open-access database of spatial malariometric data for over a decade. This data spans various formats and topics, including: geo-located surveys of malaria parasite rate; global administrative boundary shapefiles; and global and regional rasters representing the distribution of malaria and associated illnesses, blood disorders, and intervention coverage. MAP has recently released malariaAtlas, an R package providing a direct interface to MAP's routinely-updated malariometric databases and research outputs. METHODS AND RESULTS: The current paper reviews the functionality available in malariaAtlas and highlights its utility for spatial epidemiological analysis of malaria. malariaAtlas enables users to freely download, visualise and analyse global malariometric data within R. Currently available data types include: malaria parasite rate and vector occurrence point data; subnational administrative boundary shapefiles; and a large suite of rasters covering a diverse range of metrics related to malaria research. malariaAtlas is used here in two mock analyses to illustrate how this data may be incorporated into a standard R workflow for spatial analysis. CONCLUSIONS: malariaAtlas is the first open-access R interface to malariometric data, providing a new and reproducible means of accessing such data within a freely available and commonly used statistical software environment. In this way, the malariaAtlas package aims to contribute to the environment of data-sharing within the malaria research community.

  • Introduction to minitrack on development methods for electronic government

    16 October 2018

    This outline introduces the Minitrack on Development Methods for Electronic Government, organized as part of the Electronic Government Track at HICSS-45. After explaining the rationale and scope of the Minitrack, the paper presents the summaries of accepted papers. © 2012 IEEE.

  • Towards a framework for security in eScience

    16 October 2018

    This paper describes an approach to the formulation and classification of security requirements in eScience. It explains why it is untenable to suggest that 'one size fits all', and that what is an appropriate security solution in one context may not be at all appropriate in another. It proposes a framework for the description of eScience security in a number of different dimensions, in terms of measures taken and controls achieved. A distinctive feature of the framework is that these descriptions are organised into a set of discrete criteria, in most cases presented as levels of increasing assurance. The intended framework should serve as a basis for the systematic analysis of security solutions, facilitating the processes of design and approval, as well as for the identification of expectations and best practice in particular domains. The possible usage of the framework, and the value of the approach, is demonstrated in the paper through application to the design of a national data sharing service. © 2010 IEEE.
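    The idea of describing security in discrete, ordered levels of assurance per dimension can be made concrete with a small sketch. The following Python fragment is illustrative only: the dimensions, level names, and the `meets` check are assumptions for exposition, not the paper's actual criteria.

```python
from dataclasses import dataclass

# Hypothetical dimensions with levels of increasing assurance (index 0 = weakest).
# These names are illustrative; the paper's real criteria are not reproduced here.
DIMENSIONS = {
    "authentication": ["none", "password", "federated identity", "multi-factor"],
    "data_in_transit": ["plaintext", "encrypted", "encrypted + mutual auth"],
    "audit": ["none", "access logging", "tamper-evident logging"],
}

@dataclass
class SecurityProfile:
    """Maps each dimension to the assurance level achieved (an index into DIMENSIONS)."""
    levels: dict

    def meets(self, baseline: "SecurityProfile") -> bool:
        # A design satisfies a baseline iff it is at least as assured in every
        # dimension the baseline names -- the systematic comparison the framework enables.
        return all(self.levels.get(d, 0) >= lvl for d, lvl in baseline.levels.items())

service = SecurityProfile({"authentication": 3, "data_in_transit": 2, "audit": 1})
national_baseline = SecurityProfile({"authentication": 2, "data_in_transit": 1, "audit": 1})
print(service.meets(national_baseline))  # True: the design meets the baseline
```

    One design choice worth noting: because levels within a dimension are totally ordered but the dimensions themselves are not, two profiles can be incomparable, which is exactly why 'one size fits all' fails.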

  • Semantic technologies in electronic government: Tutorial and workshop

    16 October 2018

    Joined-up government depends fundamentally on semantics - on the computable representation of meaning, so that data is associated with appropriate metadata from the start, and this association is maintained as the data is manipulated. This paper summarises a tutorial and workshop on semantic technologies for supporting electronic government. Copyright 2008 ACM.

  • Technological foundations of electronic governance

    16 October 2018

    This paper explores the relevance and opportunities for the application of mature Formal Techniques - techniques based on mathematical theories and supported by industry-ready tools and methods - to build technical solutions for Electronic Governance. The paper proceeds in four steps: (1) establishes the basic need for Formal Techniques in Electronic Governance, (2) identifies the challenges peculiar to Electronic Governance development, (3) presents the salient features and various application scenarios for Formal Techniques in general, and (4) carries out a mapping between the challenges to Electronic Governance and various application scenarios of Formal Techniques as part of solutions to such challenges. In the second part, the paper presents an overview of the tutorial and workshop on Formal Engineering Methods for Electronic Governance. The tutorial follows the four-step program, as above, and the workshop includes the presentations of four papers that exemplify various elements of the mapping, particularly: the use of formal, precise modeling techniques; the importance of security risk assessment; model-driven development of software systems; and the provision of semantic frameworks to coordinate development within and across major programs and initiatives. In the last part, the paper discusses how Formal Techniques can contribute to establishing a solid foundation for Electronic Governance. Copyright 2007 ACM.

  • Semantic frameworks for e-Government

    16 October 2018

    This paper explains how semantic frameworks can be used to support successful e-Government initiatives by connecting system design to a shared understanding of interactions and processes. It shows how metadata standards and repositories can be used to establish and maintain such an understanding, and how they can be used in the automatic generation and instantiation of components and services. It includes an account of a successful implementation at an international level, and a brief review of related approaches. Copyright 2007 ACM.

  • Model-driven architecture for cancer research

    16 October 2018

    It is a common phenomenon for research projects to collect and analyse valuable data using ad-hoc information systems. These costly-to-build systems are often composed of incompatible variants of the same modules, and record data in ways that prevent any meaningful result analysis across similar projects. We present a framework that uses a combination of formal methods, model-driven development and service-oriented architecture (SOA) technologies to automate the generation of data management systems for cancer clinical trial research, an area particularly affected by these problems. The SOA solution generated by the framework is based on an information model of a cancer clinical trial, and comprises components for both the collection and analysis of cancer research data, within and across clinical trial boundaries. While primarily targeted at cancer research, our approach is readily applicable to other areas for which a similar information model is available. © 2007 IEEE.
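    The core move of the framework - deriving the data management system from an information model rather than hand-building it - can be sketched in a few lines. This Python fragment is a toy illustration under assumed names (the trial entities, field types, and SQL target are all hypothetical; the paper's actual metamodel and SOA generation pipeline are far richer).

```python
# Hypothetical clinical-trial information model: entity -> {field: SQL type}.
# Illustrative only; not the metamodel used in the paper.
TRIAL_MODEL = {
    "Patient": {"patient_id": "TEXT", "date_of_birth": "DATE"},
    "Treatment": {"treatment_id": "TEXT", "patient_id": "TEXT", "dose_mg": "REAL"},
}

def generate_ddl(model: dict) -> str:
    """Model-driven generation: derive data-collection tables from the information
    model, so every trial built from the same model records compatible data."""
    statements = []
    for entity, fields in model.items():
        cols = ", ".join(f"{name} {sqltype}" for name, sqltype in fields.items())
        statements.append(f"CREATE TABLE {entity.lower()} ({cols});")
    return "\n".join(statements)

print(generate_ddl(TRIAL_MODEL))
```

    Because every artefact is generated from the one model, two trials sharing that model cannot drift into the incompatible module variants the abstract describes.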

  • A comparison of replication strategies for reliable decentralised storage

    16 October 2018

    Distributed hash tables (DHTs) can be used as the basis of a resilient lookup service in unstable environments: local routing tables are updated to reflect changes in the network; efficient routing can be maintained in the face of participant node failures. This fault-tolerance is an important aspect of modern, decentralised data storage solutions. In architectures that employ DHTs, the choice of algorithm for data replication and maintenance can have a significant impact upon performance and reliability. This paper presents a comparative analysis of replication algorithms for architectures based upon a specific design of DHT. It also presents a novel maintenance algorithm for dynamic replica placement, and considers the reliability of the resulting designs at the system level. The performance of the algorithms is examined using simulation techniques; significant differences are identified in terms of communication costs and latency. © 2006 ACADEMY PUBLISHER.
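    One common replica-placement strategy of the kind such comparisons cover can be sketched briefly. The Python fragment below is illustrative only - the Chord-like identifier ring, hash truncation, and successor-list scheme are assumptions for exposition, not the specific DHT design or the novel maintenance algorithm analysed in the paper.

```python
import hashlib
from bisect import bisect_right

RING_BITS = 16  # deliberately small identifier space for illustration

def ring_id(name: str) -> int:
    """Hash a node or key name onto the identifier ring."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest[:2], "big") % (2 ** RING_BITS)

def successor_replicas(key: str, nodes: list, r: int = 3) -> list:
    """Place r replicas of `key` on the r nodes that follow its hash clockwise.
    When a node fails, re-running placement over the survivors restores r replicas."""
    ring = sorted((ring_id(n), n) for n in nodes)
    ids = [i for i, _ in ring]
    start = bisect_right(ids, ring_id(key)) % len(ring)
    # Walking clockwise over distinct nodes yields distinct replica holders.
    return [ring[(start + k) % len(ring)][1] for k in range(min(r, len(ring)))]

nodes = [f"node{i}" for i in range(8)]
print(successor_replicas("some-key", nodes, r=3))  # three distinct replica holders
```

    The trade-off the paper quantifies shows up even here: successor placement keeps maintenance local (only neighbours react to a failure), at the cost of correlated placement compared with schemes that scatter replicas across the ring.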

  • Toward provably correct models of ventricular cell function

    16 October 2018

    Researchers in cardiac mechanics and electrophysiology develop computer models for analyzing complex experimental data. A key issue is model correctness: formally verifying that the model is performing as intended. We present an application of formal software engineering methods to an established electrophysiology model: the Beeler-Reuter (B-R) model of the mammalian ventricular myocyte. A formal specification fragment for the B-R model is developed, which captures the key drivers of the transmembrane potential, including four ionic currents (INa, Is, Ix1, and IK1) and a representation for the intracellular calcium ion concentration ([Ca]). Correctness-preserving transformations can be used to refine the specification into executable code, thereby assuring a provably correct implementation. The mathematical and logical tools presented here provide a rigorous approach to proving the correctness of ventricular cell models, thereby improving program implementation and verification.
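    For orientation, the transmembrane potential in the Beeler-Reuter model evolves according to a single ordinary differential equation over the four ionic currents named above. This is the standard textbook form of the 1977 model, not the notation of the paper's formal specification:

```latex
\frac{dV_m}{dt} = -\frac{1}{C_m}\left( I_{Na} + I_{s} + I_{x1} + I_{K1} - I_{stim} \right)
```

    where \(C_m\) is the membrane capacitance and \(I_{stim}\) an externally applied stimulus current; each ionic current is in turn governed by gating variables, which is what makes a machine-checked specification of the model worthwhile.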


  • About Us

    1 January 2012

    The Big Data Institute (BDI) is an interdisciplinary research centre that focuses on the analysis of large, complex, heterogeneous data sets for research into the causes and consequences, prevention and treatment of disease. Big Data methods are transforming the scale (breadth, depth and duration) and efficiency (data accumulation, storage, processing and dissemination) of large-scale clinical research. The work of the BDI requires people and projects that span traditional departmental boundaries and scientific disciplines, supported by technical resources to handle the vast quantities of data they generate.

