The building sector was responsible for nearly half of CO2 emissions in the US in 2009. According to the US Energy Information Administration, buildings consume more energy than any other sector, accounting for 48.7% of overall energy consumption, and building energy consumption is projected to grow faster than that of the industry and transportation sectors. In theory, large simulation ensembles can be leveraged throughout a building's life, from initial design and retrofitting with data-driven optimization (including evaluation of the building's location, orientation, and alternative energy-saving strategies) to total cost of ownership (TCO) simulation tools and day-to-day operation decisions. In practice, however, because of the size and complexity of the data and the varying spatial and temporal scales at which the key processes operate, analyzing simulation results is extremely costly. The energy simulation data management system (e-SDMS) software will address challenges that arise from the need to model, index, search, visualize, and analyze, in a scalable manner, large volumes of multi-variate time series resulting from observations and simulations. In this NSF Software Infrastructure for Sustained Innovation funded project, we are collaborating with Johnson Controls, Inc., a leader in information technologies for building energy systems.
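To illustrate the kind of scalable search over simulation ensembles the project targets, here is a minimal sketch of reducing multi-variate simulation series to feature vectors and querying them by similarity. All run names and values are hypothetical, and this is an illustrative toy, not the e-SDMS design.

```python
import math

# Hypothetical hourly multi-variate series (temperature C, energy kWh) from
# two simulated building configurations; values are illustrative only.
runs = {
    "baseline": [(22.0, 5.1), (23.5, 6.0), (25.0, 7.2), (24.0, 6.5)],
    "retrofit": [(22.0, 4.0), (23.5, 4.6), (25.0, 5.3), (24.0, 4.9)],
}

def features(series):
    """Reduce a multi-variate series to per-variable (mean, range) features."""
    feats = []
    for col in zip(*series):           # iterate over variables, not time steps
        feats += [sum(col) / len(col), max(col) - min(col)]
    return feats

# A trivial "index": precomputed feature vectors keyed by run name.
index = {name: features(s) for name, s in runs.items()}

def nearest(query):
    """Return the stored run whose feature vector is closest to the query's."""
    q = features(query)
    return min(index, key=lambda name: math.dist(q, index[name]))

# A new simulation with low energy use should match the retrofit run.
probe = [(22.1, 4.1), (23.4, 4.5), (25.1, 5.2), (24.0, 5.0)]
```

In a real system the feature extraction would be far richer and the index structure would support millions of windows, but the pattern of summarize-then-search is the same.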
Deviant and criminal groups flourish in virtual spaces because the actors can operate in relative anonymity, without fear of shame or stigma. A recent FBI IC3 report shows that companies' losses from cybercrime rose from $264.6 million to $559.7 million. The situation is further aggravated when deviant and criminal groups exploit legitimate web services for their malicious command and control (C&C) communication channels. While there is some knowledge of the ways that vulnerabilities are exploited, there is little research exploring the ways that attack agents, such as bots and malicious code related to advanced persistent threats (APTs), are distributed across cyberspace. It is therefore necessary to systematically investigate the creation, distribution, and attack patterns of the attack agents circulating in cyberspace. This vital information can be used to develop big-data-driven intelligence about adversarial threats and to detect and prevent such net-centric threats. This project will develop analysis methods and tools to integrate, filter, analyze, and visualize analytics results. For evaluation and testing, we will investigate big-data-based evidence to develop a trustworthy correlation model for our malware analysis, introduce a new way to identify an adversary using metrics that measure attacks and the behavior and status of adversaries, and develop a system that can identify the influential adversaries behind particular net-centric attacks. This project is being developed in collaboration with CAaNES.
The project's objective is to develop a methodology and tools for creating and populating a prototype real estate securitization chain. Residential and commercial mortgage loans are pooled and then securitized as mortgage-backed securities (MBS). Several financial institutions participate along the chain, playing the roles of mortgage originator, service provider, trustee, MBS issuer, etc. The MBS financial contract is represented by a "waterfall structure", connecting the mortgage pools and the securities, and a set of "distribution rules" that control payments. This wealth of knowledge about the interconnected network of participating financial institutions, and the dynamics of the chain, is captured within unstructured or semi-structured financial contract documents. An initial challenge for MBS+ is to identify the relevant sections of an MBS contract document; to determine a relevant template for knowledge extraction from each section; to develop the extractors; and finally to validate the results. Integrating MBS+ with other heterogeneous datasets (e.g., historical payments against mortgages and the performance of securities) will complete the modeling of the supply chain. MBS contracts were at the center of the 2008 financial crisis. This project will demonstrate the use of text analytics and is carried out in partnership with the IBM SystemT team and the Fisher Center for Real Estate at the Haas School at the University of California, Berkeley.
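The identify-sections, apply-template, validate pipeline can be sketched with plain regular expressions. The contract fragment, section headings, and extraction template below are all hypothetical stand-ins, not actual MBS+ extractors (which are built with declarative text analytics rather than ad hoc regexes).

```python
import re

# Hypothetical fragment of an MBS contract document (illustrative only).
contract = """
SECTION 5.01 Distributions.
On each Distribution Date, the Trustee shall distribute interest at a
rate of 4.25% per annum to the Class A-1 Certificates.

SECTION 6.02 Servicing.
The Servicer shall remit collections to the Trustee.
"""

# Step 1: identify relevant sections by their headings.
parts = re.split(r"(?m)^SECTION\s+(\d+\.\d+)\s+(.+)\.$", contract)
# re.split with capture groups yields [pre, num, title, body, num, title, body, ...]
records = [
    {"section": parts[i], "title": parts[i + 1], "body": parts[i + 2].strip()}
    for i in range(1, len(parts) - 2, 3)
]

# Step 2: apply a per-section extraction template, e.g. pull the interest
# rate and certificate class out of "Distributions" sections.
RATE = re.compile(r"rate of ([\d.]+)% per annum to the (Class [A-Z0-9-]+)")
for rec in records:
    if rec["title"] == "Distributions":
        m = RATE.search(rec["body"])
        if m:
            rec["rate_pct"] = float(m.group(1))
            rec["certificate_class"] = m.group(2)

# Step 3: validate - every Distributions section should yield a rate.
assert all("rate_pct" in r for r in records if r["title"] == "Distributions")
```

Each section type gets its own template, and the validation step flags documents where an expected field failed to extract, which is where human review enters the loop.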
This NSF Partnerships for Innovation: Building Innovation Capacity (PFI:BIC) funded project focuses on building a platform that will integrate data from multiple sources and on exploring data analysis techniques that can more accurately detect indications of financial fraud. The research discoveries underpinning this project include solutions to the platform and processing challenges that arise from the need to integrate, filter, analyze, and visualize, in a secure and scalable manner, large private knowledge networks that also incorporate uncontrolled, unrestricted, untrusted, unstructured, and unpredictable data from external domains. In this project, we are partnering with Early Warning Services, LLC, known throughout the financial services industry as a leader in fraud prevention and risk management. Early Warning, a limited liability company owned by Bank of America, BB&T, Capital One, JPMorgan Chase, and Wells Fargo, provides its customers with fraud and risk management tools through collaboration and the sharing of information within the industry.
The objective is to develop a synergistic decision framework that enables next-generation building clusters to work as an adaptive and robust system within a smart grid, reducing overall energy consumption and allowing for optimal operation decisions enabled by cyber support tools. It is envisioned that the next generation of building systems will freely form clusters, with buildings within a cluster autonomously sharing and exchanging site-generated energy, thereby fundamentally transforming energy consumption in buildings, today the sector with the highest energy use. This is a collaborative effort among Arizona State University, Drexel University, the University at Buffalo - SUNY, and Siemens, Inc. The team is developing (1) an emulator for net-zero energy building clusters to benchmark and evaluate different operation strategies; (2) a methodology to generate and calibrate high-fidelity networked energy consumption models for temporally and spatially distributed buildings; and (3) multi-time-scale adaptive decision algorithms for dynamic operation strategies.
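The idea of buildings in a cluster sharing site-generated energy before drawing from the grid can be sketched as a single greedy allocation step. The building names, generation and demand figures, and the pooling rule below are illustrative assumptions, not the project's actual decision algorithms.

```python
# Hypothetical sketch: one time step of surplus sharing inside a building
# cluster. Buildings with surplus generation contribute to a shared pool;
# deficit buildings draw from the pool before touching the grid.
def share_energy(generation, demand):
    """Return each building's residual grid draw (kWh) after cluster sharing."""
    net = {b: generation[b] - demand[b] for b in demand}
    pool = sum(v for v in net.values() if v > 0)   # total cluster surplus
    grid_draw = {}
    for b, v in net.items():
        if v >= 0:
            grid_draw[b] = 0.0                     # self-sufficient this step
        else:
            from_pool = min(pool, -v)              # cover deficit from the pool
            pool -= from_pool
            grid_draw[b] = -v - from_pool          # remainder comes from grid
    return grid_draw

gen = {"A": 8.0, "B": 2.0, "C": 0.0}   # kWh generated this step (hypothetical)
dem = {"A": 5.0, "B": 4.0, "C": 3.0}   # kWh demanded this step (hypothetical)
```

Without sharing, buildings B and C would draw 5 kWh from the grid in this step; with A's 3 kWh surplus pooled, the cluster draws only 2 kWh. The actual framework replaces this greedy rule with calibrated models and multi-time-scale optimization.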
This is a collaborative effort between the School of Computing, Informatics, and Decision Systems Engineering at ASU and the Department of Radiology at Mayo Clinic Arizona. Our goal is to improve patient care by analyzing and managing information in radiology images and databases. We achieve this goal by developing novel informatics, statistics & machine learning, and systems engineering approaches. Specific research challenges include (a) System Informatics and Process Improvement, (b) Radiology Informatics & Analytics, and (c) Clinical Applications, including diagnosis, treatment response, multi-source information, multi-modality imaging, and genomics.
This Joint Path Finding (JPF) project's goal is to develop and mature the technologies needed to create and sustain a smart living environment. Specifically, this JPF will study, within one year, the feasibility of the Internet of Things (IoT) as a class of intelligent devices enabling the smart living environment. The project has two objectives: one is to conduct research in IoT-related technologies to enable smart living, and the other is to mature those technologies through proof-of-concept (POC) demonstrations and pilot deployments. The team will use ASU's Sun Devil Stadium renovation project as the usage scenario to focus the JPF. The anticipated benefit of the JPF is two-fold: to accelerate the deployment of IoT technologies at Sun Devil Stadium and to establish smart living laboratories at ASU and DCU with Intel's participation and guidance.
This is a collaboration with Intel, the Center for Ubiquitous Computing (CUBIC) at ASU, and Dublin City University (DCU).
Today, the lifecycle of television media assets is not limited to TV broadcasting. Instead, the lifecycle of the media begins with the initial TV broadcast and continues on social networks and the internet, where content owners publish program fragments and users follow, view, enrich, and re-propagate them with additional comments, blog postings, and ratings. Moreover, the resulting media ecosystem of videos, news, comments, and posts related to TV content is tightly linked through the people and events that they refer to, the underlying knowledge network that relates these people and events to each other, and the social network of viewers and commenters who interact with this media ecosystem. Consequently, the "second life" of TV programs is media- and information-rich, social, and highly dynamic. The goal of this international collaboration with the RAI Research and Technological Innovation Center is to define a model, and develop a corresponding system, for the extraction and integration of the heterogeneous and dynamic data coming from different knowledge sources (broadcasters' archives, online newspapers, blogs, web encyclopedias, social media platforms, social networks, etc.) and to use this data to improve the online and broadcast TV experience.