Tag Archives: big data


The Disruptors

Disruption can mean a lot of things. Dictionary definitions include “a forcible separation” or a division into parts. More recently it has come to mean a radical change in an industry or business, which brings to mind huge technological innovations. But what if it’s as simple as realising that a handheld device for detecting nitrogen could also be used to gauge how much feed there is in a paddock; that drones can be adapted to measure pest infestations; that communities can proactively track the movement of feral animals?

These are just some of the projects that Cooperative Research Centres (CRCs) are working on that have the capacity to change crop and livestock outcomes in Australia, improve our environment and advance our financial systems.

Data and environment

Mapping pest threats

Invasive animals have long been an issue in Australia. But a program developed by the Invasive Animals CRC called FeralScan is taking advantage of the widespread use of smartphones to combat this problem.

The program involves an app that enables landholders to share information about pest animals and the impacts they cause to improve local management programs.

Peter West, FeralScan project coordinator at the NSW Department of Primary Industries, says the team wouldn’t have thought of a photo-sharing app without genuine community consultation.

The project has been running for six years and can record sightings, impacts and control activities for a wide range of pest species in Australia, including rabbits, foxes, feral cats, cane toads and myna birds. West says that it now has 70,000 records and photographs, and more than 14,000 registered users across the country.


“For regional management of high-impacting pest species, such as wild dogs, what we’re providing is a tool that can help farmers and biosecurity stakeholders detect and respond quickly to pest animal threats,” says West.

“It enables them to either reprioritise where they are going to do control work or to sit down and work with other regional partners: catchment groups, local biosecurity authorities and the broader community.”

The app won the Environment and Energy Minister’s Award for a Cleaner Environment in the field of Research and Science Excellence at the Banksia Foundation 2016 Awards in December. Recent improvements to the app include the ability to monitor rabbit bio-control agents. Plans for the future include upgrading the technology to alert farmers to nearby pest threats, says West.

Find out more at feralscan.org.au

Revising disaster warnings

Also in the information space, the Bushfire and Natural Hazards CRC (BNHCRC) is investigating why we ignore, or fail to pay attention to, messages warning of an impending fire or flood. Researchers are using theories of marketing, crisis communications and advertising to craft messaging most likely to help people get out of harm’s way.

“The way we personally assess risk has a big impact on how we interpret messages. If I have a higher risk tolerance I will probably underestimate risk,” says Vivienne Tippett, BNHCRC project lead researcher and professor at Queensland University of Technology. “We’ve worked with many emergency services agencies to assist them to reconstruct their messages.”

Instead of an emergency message with a brief heading, followed by the agency name and then a quite technical paragraph about weather conditions and geography, Tippett’s team has worked on moving the key message up to the top and translating it into layperson terms. For example, a message might now say something like: “This is a fast-moving, unpredictable fire in the face of strong winds.”

Tippett’s team works constantly with emergency services to make sure its findings are put to use as quickly as possible. “The feedback from the community is that yes, they understand it better and they would be more likely to comply,” she says.

Find out more at bnhcrc.com.au

AgTech

Measuring plant mass and pests in crops

The Plant Biosecurity CRC is using unmanned aerial systems (UAS or drones) to improve ways to detect pest infestations in vast crops. Project leader Brian McCornack is based at Kansas State University in the US.

“The driver for using unmanned aerial systems has been in response to a need to improve efficiency [reduce costs and save time] for surveillance activities over large areas, given limited resources,” says McCornack. “The major game-changer is the affordability of existing UAS technology and sophisticated sensors.”

Unmanned aerial vehicle Credit: Kansas State University

The project is now in its third year and adds an extra layer of data to the current, more traditional system, which relies on a crop consultant making a visual assessment based on a small sample area of land, often from a reduced vantage point.

The international collaboration between the US and the Australian partners at QUT, Queensland Department of Agriculture and Fisheries, and the NSW Department of Primary Industries means the project has access to a wide range of data on species of biosecurity importance.

Unmanned aerial system (drone) pilots, Trevor Witt (left) and Dr Jon Kok (right) from the Plant Biosecurity CRC project, discuss data collected from a hyperspectral camera. Credit: Brian McCornack, Kansas State University

The CRC for Spatial Information (CRCSI) has also been working on repurposing an existing gadget, in this case to improve the accuracy of estimating pasture biomass. Currently, graziers use techniques such as taking height measurements or eyeballing to determine how much feed is available to livestock in a paddock. However, such techniques can result in huge variability in estimates of pasture biomass, and often underestimate the feed on offer.

Professor David Lamb, leader of the Biomass Business project, says graziers underestimate green pasture biomass by around 50%, so there is huge potential to improve farm productivity by getting these measures right.

Through case studies conducted on commercial farms in Victoria, Meat and Livestock Australia found that improving feed allocation could increase productivity by 11.1%, or up to $96 per hectare on average, for sheep enterprises, and 9.6% ($52 per hectare) for cattle enterprises.

The CRCSI and Meat and Livestock Australia looked at a number of devices that measure NDVI (the normalised difference vegetation index), like the Trimble GreenSeeker® and the Holland Crop Circle®. The data collected by these devices can then be entered into the CRCSI app to provide calibrated estimates of green pasture biomass.
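NDVI itself is a simple ratio of red and near-infrared reflectance, and converting it to a biomass figure is essentially a calibration exercise. The sketch below illustrates the general idea; the slope and intercept are hypothetical placeholders, not CRCSI calibration values.

```python
# A minimal sketch: compute NDVI from red/near-infrared reflectance and apply
# a linear calibration to estimate green biomass. The slope and intercept are
# hypothetical placeholders, not CRCSI calibration values.

def ndvi(nir: float, red: float) -> float:
    """Normalised difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def biomass_estimate(ndvi_value: float,
                     slope: float = 3000.0, intercept: float = -500.0) -> float:
    """Convert NDVI to green biomass (kg DM/ha) via a linear calibration.

    Real calibrations vary by pasture type, region and season.
    """
    return max(0.0, slope * ndvi_value + intercept)

reading = ndvi(nir=0.62, red=0.08)   # a typical green-pasture reading
print(f"NDVI = {reading:.2f}, est. biomass = {biomass_estimate(reading):.0f} kg DM/ha")
```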

Graziers can also create their own calibrations as they come to understand how accurate, or inaccurate, their own estimates have been. These crowd-sourced calibrations can be shared with other graziers to increase the regional coverage of calibrations for a range of pasture types throughout the year.

Find out more at pbcrc.com.au and crcsi.com.au

Using big data on the farm

In July 2016, the federal government announced funding for a partner project “Accelerating precision agriculture to decision agriculture”. The Data to Decisions Cooperative Research Centre (D2D CRC) has partnered with all 15 rural research and development corporations (RDCs) on the project. 

“The goal of the project is to help producers use big data to make informed on-farm decisions to drive profitability,” says D2D CRC lead Andrew Skinner.

He says that while the project may not provide concrete answers to specific data-related questions, it will open up discussion on many issues and concerns that cut across different rural industries, such as yield optimisation and input efficiencies.

Collaboration between the 15 RDCs is a first in Australia and has the potential to reveal information that could shape a gamut of agricultural industries. “Having all the RDCs come together in this way is unique,” says Skinner. 

Global markets

The Capital Markets CRC, in conjunction with industry, has developed a system that can issue and circulate digital currencies securely and with very fast processing times – and, as a first mover in this space, it has the potential to be a global disruptor.

Digi.cash is a spinoff of the Capital Markets CRC and is specifically designed for centrally issued money, like national currencies. 

“Essentially we have built the printing press for electronic coins and banknotes, directly suited to issuing national currencies in digital form, as individual electronic coins and banknotes that can be held and passed on to others,” says digi.cash founder Andreas Furche.

A currency in digi.cash’s system is more than a balance entry in an accounts database: it is an actual encrypted note or coin. The transfer of an electronic note is itself the settlement, in contrast to legacy systems, where transaction ledgers are created that require settlement between accounts. So there is no settlement or clearing period.
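To make the contrast with ledger-based systems concrete, here is a toy sketch of a note whose chain of endorsements is its own ownership record, so that transferring the note is the settlement. This is an illustration of the concept only, not digi.cash’s actual protocol; a real system would use cryptographic signatures rather than the hash stand-in used here.

```python
# Toy illustration of "the transfer is the settlement": a note whose chain of
# endorsements is its ownership record. NOT digi.cash's actual protocol; a real
# system would use cryptographic signatures, not this hash stand-in.
import hashlib
from dataclasses import dataclass, field

@dataclass
class DigitalNote:
    serial: str
    value_cents: int
    issuer: str
    endorsements: list = field(default_factory=list)  # record of transfers

    def transfer(self, from_holder: str, to_holder: str) -> None:
        record = f"{self.serial}:{from_holder}->{to_holder}"
        # Endorsing the note is the settlement: no ledger, no clearing period.
        self.endorsements.append(hashlib.sha256(record.encode()).hexdigest())

note = DigitalNote(serial="AUD-000001", value_cents=5000, issuer="central-bank")
note.transfer("alice", "bob")     # money moves the moment the note is endorsed
print(len(note.endorsements))     # 1
```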

“We have an advantage globally because we were on the topic relatively early, and we have a group of people who have built a lot of banking and stock exchange technologies in the past, so we were able to develop a product which held up to the IT security standards used in banking right away,” says Furche.

Digi.cash is currently operating with a limit of $10 million in total funds on issue. It is looking to partner with industry players and to take a leading position in the development of the next-generation financial system, which CMCRC says will be based on digitised assets.

Find out more at digi.cash

Defence

Passive radar, as developed by the Defence Science and Technology Group (DST), has been around for some time, but is being refined and re-engineered in an environment where radiofrequency energy is much more common.  

In recognition of the disruptive capabilities of this technology, the Passive Radar team at DST was recently accepted into the CSIRO’s innovation accelerator program, ON Accelerate.

Active radar works by sending out a very large blast of energy and listening for reflections of that energy – but in doing so, it quickly reveals the transmitter’s whereabouts to anyone nearby.

“Passive radar is the same thing, but we don’t transmit any energy – we take advantage of the energy that is already there,” explains passive radar team member James Palmer.

The technology is being positioned as a complement for active radar. It can be used where there are more stringent regulations around radar spectrum – such as the centre of a city as opposed to an isolated rural area. Radio spectrum is also a finite resource and there is now so much commercial demand that the allocation for Defence is diminishing.

Although the idea of passive radar is not a new one – one of the first radar demonstrations, in the 1930s, was in fact passive – the increase in radiofrequency energy from a variety of sources these days makes it far more effective. For example, signals from digital TV are much better suited to passive radar than analogue TV.
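The core signal-processing step in most passive radar systems is cross-correlating a reference copy of the ambient broadcast with the surveillance channel to find delayed echoes. Below is a minimal sketch with synthetic data; the values are illustrative, not DST’s implementation.

```python
# Minimal sketch of the core passive radar step: cross-correlate a reference
# copy of an ambient broadcast (e.g. digital TV) with the surveillance channel
# to find the delay of a target echo. Synthetic, illustrative values only.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.standard_normal(10_000)   # stand-in for the broadcast waveform

true_delay = 137                          # echo arrives 137 samples late
surveillance = (0.1 * np.roll(reference, true_delay)
                + 0.05 * rng.standard_normal(reference.size))

corr = np.correlate(surveillance, reference, mode="full")
lags = np.arange(-reference.size + 1, reference.size)
estimated_delay = lags[np.argmax(corr)]   # correlation peaks at the echo delay
print(estimated_delay)                    # 137
```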

“We are at the point where we are seeing some really positive results and we’ve been developing commercial potential for this technology,” Palmer says. “For a potentially risky job like a radar operator the ability to see what’s around you [without revealing your position], that’s very game changing.”

There is also no need to apply for an expensive spectrum licence. The Australian team is also the first in the world to demonstrate that it can use Pay TV satellites as a viable form of background radiofrequency energy. The company name Silentium Defence Pty Ltd has been registered for the commercial use of the technology.

Find out more at silentiumdefence.com.au

– Penny Pryor

For more CRC discovery, read KnowHow 2017.

You might also enjoy Beat the News with digital footprints.


Beat the News with digital footprints

Every day we produce an almost unfathomable amount of data. Posting on Twitter, Facebook, Instagram and YouTube. Commenting in chat rooms; blogging; trading stock tips; and decorating hacks in niche forums. We broadcast what we’re eating, feeling and doing from our GPS-equipped smartphones, sharing maps of our runs, photos from shows, and news that gets us cranky or inspired.

The details of our passing moods are all there, creating a vital if virtual public pulse.

Dr Brenton Cooper’s Data to Decisions (D2D) CRC team checks this pulse and, by extracting signals from our collective digital footprint, shows where we’re going next.

Are we gearing up to strike? Or celebrate? Is disease spreading? What effect will an interest rates hike have? Are we about to toss out the government, or move money out of the market?

Whatever the social disruption, D2D CRC’s Beat The News ™ forecasting system can issue a warning – before it happens. In March 2016, it accurately forecast the impact of an anti-coal port protest in Newcastle, NSW. The following May, no ships could move during the protest blockade, costing an estimated $20 million.


“This warning system tells you what might happen, when it will happen and why.”


Social media monitoring is already a billion-dollar industry, and Cooper, who is D2D CRC’s Chief Technology Officer, knows “there are plenty of tools that help you understand what’s happening right now. But this tells you what might happen, when it will happen and why.”

This sort of heads-up will be invaluable. D2D CRC’s first collaborators are Australia’s defence and national security agencies, whose analysts now have a Beat The News ™ dashboard that sifts through about two billion data points a day.

“These are people paid to understand the political climate, but they can’t read everything,” explains Cooper. “That’s where machine-enablement certainly helps.”

Maybe the agencies are watching Indonesian politics and want to know if there might be some unrest in the capital Jakarta. Beat The News ™ analyses a huge volume of open-source information, combining structured and unstructured data from a wide range of sources. It geo-locates posts, extracts key words, topics, trends and hashtags, and measures sentiment.

“Once we’ve done those types of data enrichments, we then pump it through a variety of models,” says Cooper, “to automatically and accurately predict the answer.”
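As a rough illustration of what such an enrichment step can look like, the sketch below tokenises posts, pulls out hashtags, scores sentiment against tiny word lists, and hands the features to a toy “model”. The word lists and model stub are hypothetical, not Beat The News ™ internals, which use far richer features and real predictive models.

```python
# Rough sketch of a social-media enrichment step: tokenise posts, extract
# hashtags, score sentiment against tiny word lists, then hand the features to
# a toy "model". All names here are illustrative, not Beat The News internals.
import re

POSITIVE = {"celebrate", "win", "support"}
NEGATIVE = {"strike", "protest", "blockade", "anger"}

def enrich(post: str) -> dict:
    tokens = re.findall(r"[#\w']+", post.lower())
    return {
        "hashtags": [t for t in tokens if t.startswith("#")],
        "sentiment": sum((t in POSITIVE) - (t in NEGATIVE) for t in tokens),
    }

def unrest_signal(posts: list) -> float:
    """Toy 'model': fraction of posts with net-negative sentiment."""
    enriched = [enrich(p) for p in posts]
    return sum(e["sentiment"] < 0 for e in enriched) / len(enriched)

posts = ["Port workers to strike Monday #newcastle",
         "Great day to celebrate the harbour festival"]
print(unrest_signal(posts))   # 0.5
```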

The potential applications are many, so the CRC recently trademarked Fivecast™ – “as in forecast, only one better,” says Cooper – to take the system to market, whether as a spin-off company, licensing to a partner, or licensing the IP to a third party.

US company Dataminr has raised more than US$130 million from investors for its real-time analytics, but Cooper says Fivecast™ will offer a further capacity – event prediction. It’s the only predictive geopolitical risk analytics platform on the market. Corporate risk consultancies are already interested. Their clients include global mall conglomerates alert to anything that might stop people enjoying their shopping.

Find out more about Beat The News ™ at d2dcrc.com.au

– Lauren Martin

You might also enjoy ‘Disrupting Terrorism and Crime’. Sanjay Mazumdar, CEO of the Data to Decisions CRC (D2D CRC), takes a look at what the national security sector can learn from Big Data disruption.


Disrupting terrorism and crime

When people think about digital disruption they usually think of the peer-to-peer accommodation network Airbnb, or the inexpensive ride-sharing app Uber. These businesses have redefined their respective markets – with big data analytics¹ underpinning their success.

Despite fears that disruptive tech will bring with it new threats to security, Australia’s national security sector has much to gain from the type of disruption brought about by big data – particularly when it comes to fighting terrorism and crime.

The national security sector faces the most imminent and complex big data challenges, because a powerful weapon of today’s terrorist or criminal is the ability to hide in data: they can plan and coordinate an attack or crime with impunity.

The ability for criminals to “hide in data” means that national security agencies are often faced with the daunting task of finding the “needle in the haystack” – where the haystack is growing at a phenomenal rate. In fact, people often comment that national security data analysts are “drowning in data, but starving for information”.

Big data analysts often need to find connections in vast, disparate volumes of data, where connections are imperceptible to humans but can be discovered using smart analytics and machine enablement.

The challenge is made greater by the wide variety of data sources (e.g. text, voice, images, video), the ever-increasing size and scale of the data collected, and the organisational and legislative silos that impede data sharing between agencies.

The effect of big data is that national security data analysts often spend most of their time collecting data, formatting it for analysis and generating reports, and comparatively little of their time doing the analysis itself. This is referred to as the “bathtub curve”.

The application of big data analytics is aimed at “inverting the bathtub”, which means automating the collection and processing of data to form intelligence. The generation of intelligence reports can also be automated via digital technologies, which enables analysts to spend more time analysing intelligence and making decisions.

The D2D CRC is developing applications to maximise the benefits that Australia’s national security sector can extract from big data, helping agencies generate timely and accurate intelligence as a powerful weapon against national security threats.

By addressing their big data challenges and applying high-performance analytics, the D2D CRC hopes it can support agencies in predicting threats rather than reacting to catastrophic aftermath. 

Sanjay Mazumdar

CEO, Data to Decisions CRC

Read next: Victoria’s Lead Scientist, Dr Amanda Caples, reveals the major flaw in traditional government approaches to disruption. 



¹ Big data is a term for any collection of data sets so large and complex that they become difficult to store, process and analyse using current technologies. Big data analytics is the process of examining these data sets to uncover hidden patterns, unknown correlations, trends and other useful business information.


Remote Research Ranges: a startup story

Featured image above: Dr Catherine Ball, Telstra Business Woman of the Year in 2015, and CEO & Founder of geoethics, big data and drones startup Remote Research Ranges. 

When did Remote Research Ranges start and what stage is your business?

RRR is in the first 12-month stage of a startup, but we hit the ground running and are already in good profit. Collaboration is the new competition, so we are living that ideal.

What’s the solution your business provides?

RRR is an advisory company around drone technology, big data management, and geoethics (the ethics around geospatial data).  We are providing advice to international clients, state, federal, and local governments, as well as schools, universities, and rangers.

What have been the barriers in growing your business?

One of my biggest problems has been the ability to sit and focus on one particular thing.  When establishing your own business and managing your personal brand you tend to say yes to too much, and can risk spreading yourself too thin. I have learned that ‘No’ is a complete sentence.

What expertise have you tapped into to help you in your business journey?

I have been so lucky, as Telstra Business Woman of the Year (Corporate award 2015), to be welcomed into such a helpful and excellent alumni network. The awards opened up networks for me I could never have imagined. Some of the people I have met have become mentors, sponsors, and even business partners. It has been a game-changing experience.

In your opinion, what is the most valuable thing that would support your business in the long term?

I am really looking at longer-term projects, such as the World of Drones Congress, which I co-created; this is going to be a long-running and internationally expanding congress. It will allow me to focus on fewer projects, as I will have sustainable income, and to develop very strong links across industry, so I have more choice about which projects I would like to work on.

What is the one thing you need to keep reminding yourself daily as a start-up going for sustainable growth?

Every day is a school day and I am learning a lot. The key for me is to constantly be learning, reaching and growing – looking for the ‘Blue Ocean’ and areas where the niches are either empty, or not created yet. There is a saying, “the best way to predict the future is to create it”, and I couldn’t agree more.

Dr Catherine Ball is CEO & Founder of geoethics, big data and drones startup Remote Research Ranges. 

Meet more of Australia’s leaders and researchers here.

You might also enjoy:

Australia’s most innovative women engineers

Maths researchers optimise Woodside’s vessel efficiency

Improving vessel efficiency featured image credit: Woodside Energy Ltd

Oil and gas company Woodside is streamlining its offshore operations with the assistance of new mathematical models developed in collaboration with a team of Curtin University academics.

This collaborative research project has focused on scheduling the support vessels that service Woodside-operated offshore facilities. The vessels are used for delivering supplies and for assisting with oil off-takes to oil tankers.

The most cost-efficient vessel routes are influenced by various constraints, including time windows – most facilities are only open during daylight hours – along with vessel speeds, vessel cargo capacities and the capability of each vessel to assist with oil off-takes, as not every vessel in the fleet is equipped for this operation.
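These constraints make the problem a variant of the classic vehicle routing problem with time windows. As a toy illustration only – not the Curtin team’s model – the sketch below checks whether a single vessel’s route respects cargo capacity and daylight-only opening hours.

```python
# Toy feasibility check for the constraints described above: one vessel, cargo
# capacity and daylight-only time windows. An illustration only, not the
# Curtin team's optimisation model.
from dataclasses import dataclass

@dataclass
class Facility:
    name: str
    distance_nm: float      # sailing distance from the previous stop
    demand_t: float         # cargo to deliver, tonnes
    open_h: float = 6.0     # facility opens (hour of day)
    close_h: float = 18.0   # facility closes

def route_is_feasible(stops: list, speed_kn: float, capacity_t: float,
                      depart_h: float = 6.0) -> bool:
    if sum(s.demand_t for s in stops) > capacity_t:
        return False                       # route exceeds vessel capacity
    t = depart_h
    for s in stops:
        t += s.distance_nm / speed_kn      # sailing time in hours
        t = max(t, s.open_h)               # wait for the window to open
        if t > s.close_h:
            return False                   # arrived after closing
    return True

route = [Facility("Platform A", 60, 200), Facility("Platform B", 45, 150)]
print(route_is_feasible(route, speed_kn=12, capacity_t=500))   # True
```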

Despite an industry-wide push into ‘big data’ computer technology over the past few years, the mathematical models in this project were so large that state-of-the-art optimisation software packages struggled to find good solutions, and in some cases couldn’t even begin processing the model.

New solution algorithms were consequently devised by the Curtin team and this work has been accepted to appear in the Journal of Industrial and Management Optimisation.

“One outcome of the project was providing Woodside with strong evidence for a business case to reduce the support fleet from four to three vessels – this is a significant saving since the cost of running an additional vessel is considerable,” says Curtin’s Associate Professor Ryan Loxton, who led the project.

“Another outcome was modelling the implications of changing the vessel schedule from a ‘taxi-style’ service whereby vessels would service facilities on demand, to a regular fixed schedule that is easier to deliver in practice.”

The Curtin team’s current focus is on developing more powerful optimisation algorithms that will allow for ‘on the fly’ dynamic optimisation of day-to-day and week-to-week vessel schedules.

“Major challenges include the current dynamic and uncertain operating environment, and the computational demands required. The standard solution algorithms are too slow for the problems that we encounter,” says Curtin’s Dr Elham Mardaneh, who worked on the project.

Although the models were highly customised to suit Woodside’s offshore operations, Mardaneh says that there is also considerable potential to adapt the technology to make optimal routing decisions in other industries such as mining.

“Mine sites also involve difficult vehicle routing problems, such as how to route haul trucks among different locations in the most optimal manner.”

– Blair Price

This article on vessel efficiency was first published by Science Network WA on 24 September 2016. Read the original article here.


Blue technology revolution

Featured image above: Humanoid robots, like Ocean One, may soon replace human divers in carrying out deep or dangerous ocean research and engineering tasks. Credit: Osada/Seguin/DRASSM

An industrial revolution is unfolding under the seas. Rapid progress in the development of robots, artificial intelligence, low-cost sensors, satellite systems, big data and genetics are opening up whole new sectors of ocean use and research. Some of these disruptive marine technologies could mean a cleaner and safer future for our oceans. Others could themselves represent new challenges for ocean health. The following 12 emerging technologies are changing the way we harvest food, energy, minerals and data from our seas.

1. Autonomous ships

Credit: Rolls-Royce

You’ve heard of driverless cars – soon there may be skipperless ships. Ocean shipping is a $380 billion industry. Like traffic on land, ocean traffic is a major source of pollution, can introduce invasive species, and even causes ocean road-kills: over 200 whales were struck by ships in the past decade, for example. Companies like Rolls-Royce envision autonomous shipping as a way to make the future of the industry more efficient, clean and cost-effective. Skipperless cargo ships can increase efficiency and reduce emissions by eliminating the need for crew accommodation, but will require integration of existing sensor technology with improved decision-making algorithms.

2. SCUBA droids

Credit: Osada/Seguin/DRASSM

SCUBA divers working at extreme depths often have less than 15 minutes to complete complicated tasks, and they subject their bodies to 10 times normal pressure. To overcome these challenges, a Stanford robotics team designed Ocean One: a humanoid underwater robot that is dexterous enough to handle archaeological artefacts and employs force sensors to replicate a sense of touch for its pilot. Highly skilled humanoid robots may soon replace human divers in carrying out deep or dangerous ocean research and engineering tasks.

3. Underwater augmented reality glasses

Credit: US Navy Photo by Richard Manley

Augmented and virtual reality technologies are becoming mainstream and are poised for enormous growth. The marine sector is no exception. US Navy engineers have designed augmented-vision displays for their divers – a kind of waterproof, supercharged version of Google Glass. This new tech allows commercial divers and search and rescue teams to complete complex tasks in near-zero visibility, and integrates data feeds from sonar sensors and intel from surface support teams.

4. Blue revolution

Credit: InnovaSea

The year 2014 was the first in which the world ate more fish from farms than the wild. Explosive growth in underwater farming has been facilitated by the development of new aquaculture tech. Submerged “aquapod” cages, for example, have been deployed in Hawaii, Mexico, and Panama. Innovations like this have moved aquaculture further offshore, which helps mitigate problems of pollution and disease that can plague coastal fish farms.

5. Undersea cloud computing

Credit: Microsoft

Over 95% of internet traffic is transmitted via undersea cables. Soon, data may not only be sent, but also stored underwater. High energy costs of data centres (up to 3% of global energy use) have driven their relocation to places like Iceland, where cold climates increase cooling efficiency. Meanwhile, about 40% of people on the planet live in coastal cities. To simultaneously cope with high real estate costs in these oceanfront growth centres, reduce latency, and overcome the typically high expense of cooling data centres, Microsoft successfully tested a prototype underwater data centre off the coast of California last year. Next-generation underwater cloud pods may be hybridised with their own ocean energy-generating power plants.

6. New waves of ocean energy

Credit: Carnegie Wave Energy

The ocean is an enormous storehouse of energy. Wave energy alone is estimated to have the technical potential of 11,400 terawatt-hours/year (with sustainable output equivalent to over 400 small nuclear power plants). Technological innovation is opening up new possibilities for plugging into the power of waves and tides. A commercial project in Australia, for example, produces both electricity and zero-emission desalinated water. The next hurdles are scaling up and making ocean energy harvest cost-efficient.

7. Ocean thermal energy

Credit: KRISO (Korea Research Institute of Ships & Ocean engineering)

Ocean thermal energy conversion technology, which exploits the temperature difference between shallow tropical waters and the deep sea to generate electricity, was successfully implemented in Hawaii last year at its largest scale yet. Lockheed Martin is now designing a plant with 100 times greater capacity. Drawing cold water in large volumes up from depths of over 1 kilometre requires large flexible pipelines made with new composite materials and manufacturing techniques.

8. Deep sea mining

Credit: Nautilus Minerals

Portions of the seafloor are rich in rare and precious metals like gold, platinum and cobalt. These marine mineral resources have, up until now, lain mostly out of reach. New 300 tonne waterproof mining machines were recently developed that can now travel to some of the deepest parts of the sea to mine these metals. Over a million square kilometres of ocean have been gazetted as mining claims in the Pacific, Atlantic, and Indian oceans, and an ocean gold rush may open up as early as 2018. Mining the seafloor without destroying the fragile ecosystems and ancient species often co-located with these deep sea mineral resources remains an unsolved challenge.

9. Ocean big data

Credit: Windward

Most large oceangoing ships are required to carry safety sensors that transmit their location through open channels to satellites and other ships. Several emerging firms have developed sophisticated algorithms to process this mass influx of ocean big data into usable patterns that detect illegal fishing, promote maritime security, and help build intelligent zoning plans that better balance the needs of fishermen, marine transport and ocean conservation. In addition, new streams of imagery from nanosatellite constellations can be analysed to monitor habitat changes in near-real time.
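One simple pattern such algorithms look for is a vessel “going dark”: long gaps between position reports, a common flag in illegal-fishing analytics. A minimal sketch, with an illustrative threshold:

```python
# Sketch of one pattern such algorithms flag: a vessel "going dark" (long gaps
# between AIS position reports), often treated as an illegal-fishing indicator.
# The six-hour threshold is illustrative.
from datetime import datetime, timedelta

def dark_periods(pings: list, max_gap: timedelta = timedelta(hours=6)) -> list:
    """Return (start, end) pairs where consecutive pings are too far apart."""
    pings = sorted(pings)
    return [(a, b) for a, b in zip(pings, pings[1:]) if b - a > max_gap]

track = [datetime(2016, 9, 1, 0), datetime(2016, 9, 1, 2),
         datetime(2016, 9, 1, 14),   # 12 hours of silence before this ping
         datetime(2016, 9, 1, 15)]
for start, end in dark_periods(track):
    print(f"dark from {start} to {end}")
```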

10. Medicines from the seas

Credit: PharmaSea

The oceans hold vast promise for novel life-saving medications such as cancer treatments and antibiotics. The search for marine-derived pharmaceuticals is increasing in momentum. The European Union, for example, funded a consortium called PharmaSea to collect and screen biological samples using deep sea sampling equipment, genome scanning, chemical informatics and data-mining.

11. Coastal sensors

Image: Smartfin

The proliferation of low-cost, connected sensors is allowing us to monitor coastlines in ways never possible before. This matters in an ocean that is rapidly warming and becoming more acidic as a result of climate change. Surfboard-embedded sensors could crowd-source data on temperature, salinity and pH similar to the way traffic data is being sourced from drivers’ smartphones. To protect the safety of beachgoers, sonar imaging sensors are being developed in Australia to detect sharks close to shore and push out real-time alerts to mobile devices.

12. Biomimetic robots

Credit: Boston Engineering

The field of ocean robotics has begun borrowing blueprints from the world’s best engineering firm: Mother Nature. Robo-tuna cruise the ocean on surveillance missions; sea snake-inspired marine robots inspect pipes on offshore oil rigs; 1,400-pound crab-like robots collect new data on the seafloor; and robo-jellyfish are under development to carry out environmental monitoring. That ocean species are models for ocean problem-solving is no surprise, given that these animals are the result of millions of years of trial and error.

Outlook

Our fate is inextricably linked to the fate of the oceans. Technological innovation on land has helped us immeasurably to clean up polluting industries, promote sustainable economic growth, and intelligently watch over changes in terrestrial ecosystems.

We now need ocean tech to do the same under the sea.

As the marine industrial revolution advances, we will need to lean heavily on innovation, ingenuity and disruptive tech to successfully take more from the ocean while damaging it less.

– Douglas McCauley and Nishan Degnarain

This article was first published by World Economic Forum on 16 September 2016. Read the original article here.


Big data, big business

Featured image above: Plume Labs use pigeons to monitor air quality in London. Credit: Plume Labs

Optimising highway networks, mapping crime hotspots and producing virtual reality sporting experiences based on real-life games: these are just a few of the exciting outcomes that new businesses are now achieving with complex data analysis. More and more startups are using readily available data to create products and services that are game changers for their industries.

Big data, for example, is what lies behind Uber’s huge success as a taxi alternative; the company optimises processes by using data analysis to predict peak times, journey time and likely destinations of passengers. Many other companies are now using data to produce technology-based solutions for a range of issues and even designing new ways to collect data.

A weather station and umbrella all in one

Wezzoo, a Paris-based start-up company, has designed a smart umbrella that tells users when it’s going to rain. The ‘oombrella’, as it’s been dubbed, is strikingly iridescent, sturdy in design, and presents a data-based solution to staying dry. It will send a notification to a smartphone 15 minutes before predicted rain and also send a reminder when it’s been left behind on a rainy day.

The oombrella itself is also a mobile weather station, able to detect temperature, atmospheric pressure, light and humidity. “Each oombrella will collect data and share it with the community to make hyperlocal weather data more accurate,” says the company.

Real-time meteorological information from each oombrella is uploaded to Wezzoo’s existing social weather service app. More than 200,000 people already use the app and upload their own weather reports from all over the world, creating a more interactive and collaborative approach to weather observation. This data, as well as information from weather stations is used to create personalised predictions for oombrella users.

‘Pigeon Air Patrol’ monitors pollution

Plume Labs, in collaboration with DigitasLBi and Twitter UK, have literally taken to the skies with their latest air pollution monitoring project, Pigeon Air Patrol. They recently strapped lightweight air-quality sensors to the backs of 10 London-based pigeons to gather data on pollution in the city’s skies. For the duration of the project, the public could tweet their location to @PigeonAir and receive a live update on levels of nitrogen dioxide and ozone, the main harmful gases in urban areas. Not only did this innovative project help collect data in new ways, it raised awareness of air pollution in large cities.

“Air pollution is a massive environmental health issue, killing nearly 10,000 people every year in London alone,” says Romain Lacombe, Plume Labs’ CEO.

“Air pollution is an invisible enemy, but we can fight back: actionable information helps limit our exposure, improve our health and well-being, and make our cities breathable.”

Plume’s core focus is tracking and forecasting ambient pollution levels to allow city dwellers to minimise harmful exposure to polluted air. Their free phone app – the Plume Air Report – uses data from environmental monitoring agencies and public authorities to provide individuals with real-time information on air pollution safety levels at their locations. With the use of environmental Artificial Intelligence, the app predicts air pollutant levels for 300 cities and 40 countries with double the accuracy of traditional forecasting methods. “Predictive technologies will help us take back control of our environment,” Lacombe says.

The company, whilst still small, has managed to raise seed funding from French banks. It plans to build a business based on aggregating data, though is also open to developing hardware.

Innovative data collection methods are not only good for science, it seems; they can also be a strong foundation for business.

This article was first published by the Australian National Data Service on 24 May 2016. Read the original article here.


Supercomputer empowers scientists

Creating commercial drugs these days seems to require more time at the keyboard than in the lab, as these drugs can be designed on a computer long before any chemicals are combined.

Computer-based simulations test the design created by the theoretical chemist and quickly indicate any potential problems or enhancements.

This process generates data, and lots of it. So, to give University of Western Australia (UWA) chemistry researchers the power to perform these big data simulations, the university built its own supercomputer, Pople.

Dr Amir Karton, head of UWA’s computational chemistry lab, says the supercomputer is named after Sir John Pople, one of the pioneers of computational chemistry, for which he won a Nobel Prize in 1998.

“We model very large systems ranging from enzymes to nano materials to design proteins, drugs and catalysts, using multi-scale theoretical procedures, and Pople was designed for such simulations,” Karton says.

“These simulations will tell you how other drugs will interact with your design and what modifications you will need to do to the drug to make it more effective.”

Pople was designed by UWA and, while it is small compared with Magnus at the Pawsey Supercomputing Centre, it gives the researchers exactly what they want: multi-core processors, a large and very fast local disk, and 512 GB of memory in which to run the simulations.

Magnus’ power equivalent to 6 million iPads

While Magnus has nearly 36,000 processors – processing power equivalent to six million iPads running at once – Pople has just 2316 processors.

But Magnus was designed with large computational projects like the Square Kilometre Array in mind, whereas Pople provides such services to individual users.

Dr Dean Taylor, the faculty’s systems administrator, says the total amount of memory available to Pople is 7.8 TB and the total disk space is 153 TB, which could fill almost two thousand 80 GB Classic iPods.

By comparison, a top-of-the-range gaming PC might have four processors, 16 GB of memory and a 2 TB disk drive.

A large portion of the Intel Xeon processors (1896 cores) was donated by Perth-based geoscience company DownUnder GeoSolutions.

DownUnder GeoSolutions’ managing director Dr Matthew Lamont says it is the company’s way of investing in the future.

Pople will also assist physics and biology research involving the nature of gravitational waves and the combustion processes that generate compounds important for seed germination.

– Chris Marr

This article was first published by ScienceNetwork Western Australia on 30 April 2016. Read the original article here.


Innovation in Western Australia

Science is fundamental for our future social and economic wellbeing.

In Western Australia we’re focusing on areas where we have natural advantages, and an appropriate base of research and industrial capacity. Western Australia’s Science Statement, released by Premier Barnett in April 2015, represents a capability audit of relevant research and engagement expertise in our universities, research institutes, State Government agencies and other organisations. Mining and energy, together with agriculture, are traditional powerhouses, but the science priorities also reflect the globally significant and growing capabilities in medicine and health, biodiversity and marine science, and radio astronomy. It’s a great place to begin exciting new collaborations.

The Science Statement has also helped to align efforts across research organisations and industry. For instance, in 2015 an industry-led “Marine Science Blueprint 2050” was released, followed by the Premier commissioning a roundtable of key leaders from industry, Government, academia and community to develop a long-term collaborative research strategy. These meetings highlighted critical areas of common interest such as decommissioning, marine noise, community engagement and sharing databases.


“Opportunities abound for science and industry to work together to translate research into practical, or commercial, outcomes.”


Science, innovation and collaboration are integral to many successful businesses in Western Australia. In the medical field, a range of technological innovations have broadened the economy and created new jobs. Some of these success stories include Phylogica, Admedus, Orthocell, iCeutica, Dimerix, Epichem and Proteomics International. Another example in this space is the Phase I clinical trial facility, Linear Clinical Research, which was established with support from the State Government – 75% of the trials conducted to date come from big pharmaceutical and biotechnology companies in the USA.

Opportunities abound for science and industry to work together to translate research into practical, or commercial, outcomes. For example, the field of big data analytics is rapidly permeating many sectors. Perth’s Pawsey Centre, the largest public research supercomputer in the southern hemisphere, processes torrents of data from many sources, including radio astronomy: the world’s largest radio telescope, the Square Kilometre Array, is being developed in outback WA. In addition, local company DownUnder GeoSolutions has a supercomputer five times the size of Pawsey for massive geophysical analyses. In such a rich data environment, exciting new initiatives like Cisco’s Internet of Everything Innovation Centre, in partnership with Woodside, are helping to drive innovation and growth.

Leading players in the resources and energy sector are also taking innovative approaches to enhance efficiency and productivity. Rio Tinto and BHP Billiton use remote-controlled driverless trucks, and autonomous trains, to move iron ore in the Pilbara. Woodside has an automated offshore facility, while Shell is developing its Prelude Floating Liquefied Natural Gas facility, soon to be deployed off the northwest coast. Excitingly, three emerging companies (Carnegie, Bombora and Protean) are making waves by harnessing the power of the ocean to generate energy.

This high-tech, innovative environment is complemented by a rapidly burgeoning start-up ecosystem. In this vibrant sector, Unearthed runs events, competitions and accelerators to create opportunities for entrepreneurs in the resources space. Spacecubed provides fabulous co-working space for young entrepreneurs, including the recently launched FLUX for innovators in the resource sector. The online graphic design business Canva, established by two youthful Western Australians, epitomises what entrepreneurial spirit and a can-do attitude can achieve. In this amazingly interconnected world, the sky’s the limit.

Professor Peter Klinken

Chief Scientist of Western Australia

Read next: Professor Barney Glover, Vice-Chancellor and President of Western Sydney University and Dr Andy Marks, Assistant Vice-Chancellor (Strategy and Policy) of Western Sydney University on Making innovation work.




Cloud collaboration

Featured image above: the Square Kilometre Array (SKA) by SKA Organisation

Cloud services – internet resources available on demand – have created a powerful computing environment, with big customers like the US Government and NASA driving developments in data and processing.

When building the infrastructure to support the Square Kilometre Array (SKA), soon to be the world’s biggest radio telescope, the International Centre for Radio Astronomy Research (ICRAR) benefitted from some heavyweight cloud computing experience.

ICRAR’s executive director, Professor Peter Quinn, says the centre approached cloud computing services company Amazon Web Services (AWS) to assess whether it could process the data from the SKA.

When operational in 2024, the SKA will generate data rates in excess of the entire world’s internet traffic.

An artist’s impression of the Square Kilometre Array’s antennas in Australia. © SKA Organisation

ICRAR worked with an international consortium of astronomers to conduct a survey with the Jansky Very Large Array telescope, employing AWS to process the data, and is now trying to determine how the services will work with a larger system.

Head of ICRAR’s Data Intensive Astronomy team, Professor Andreas Wicenec, says there are many options from AWS.

“Things are changing quickly – if you do something today, it might be different next week.”

Quinn says cloud systems assist international collaboration by providing all researchers with access to the same data and software. They’re also cost-effective, offering on-demand computing resources where researchers pay for what they use.

– Laura Boness

www.icrar.org



Australian researchers develop Big Data tool to test new medicines

Australian scientists have developed a rapid detection tool to map the effects of new medicines already on the market, potentially saving millions of health practitioners from prescribing medicines with lesser-known yet serious side effects.

Lead researcher Dr Nicole Pratt, a senior research fellow at the University of South Australia‘s School of Pharmacy and Medical Sciences, has been working with the Asian Pharmacoepidemiology Network (AsPEN) to develop a mathematical algorithm that charts the temporal relationship between a new medicine and reports of adverse side effects around the globe.

“At the time a new medicine is first released onto the market less than 50% of the side effects are known.”

The rapid detection tool is able to quickly analyse large population datasets of up to 200 million people, containing information about the time a patient is prescribed a new medicine (captured at the point of purchase) and recorded hospitalisation events.

“We look at the link between starting a new medicine and a hospitalisation event and determine whether there is an association between those two events,” says Pratt.
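One widely used way to quantify such a link in pharmacoepidemiology is sequence symmetry analysis: if a drug causes an adverse event, the event should follow the first prescription more often than it precedes it. The sketch below shows a crude version of that idea; it is an illustration of the general approach, not necessarily the exact algorithm behind Pratt’s tool.

```python
# Crude sketch of sequence symmetry analysis: if a drug causes an adverse
# event, the event should follow the first prescription more often than it
# precedes it. An illustration of the general approach, not necessarily the
# exact algorithm behind Pratt's tool.

def sequence_ratio(patients: list) -> float:
    """patients: (first_prescription_day, hospitalisation_day) per patient.

    Ratios well above 1.0 suggest an association worth investigating.
    """
    after = sum(1 for rx, ev in patients if ev > rx)
    before = sum(1 for rx, ev in patients if ev < rx)
    return after / before if before else float("inf")

cohort = [(100, 130), (200, 260), (150, 120), (300, 340)]  # days from enrolment
print(f"crude sequence ratio = {sequence_ratio(cohort):.2f}")   # 3.00
```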

At the time a new medicine is first released onto the market, less than 50% of its side effects are known.

On average, new medicines are tested on less than 2000 people before they are prescribed – too few to determine if rarer, serious side effects exist.

Pratt’s rapid detection tool has the potential to become a real-time surveillance tool for drug administration bodies, researchers and general practitioners, helping them to identify the effects of new medications before they lead to widespread complications.

“We’d like to see it reach the point where we are constantly looking at the data and trying to capture problems as soon as they happen rather than let them happen for years and years and then do a big study to find that there have been a whole heap of heart attacks.”

The tool is already being used in several countries, including Japan, Korea, Taiwan, Canada and Australia to look at the side effects of a heartburn medication prescribed for reflux, and a medication for diabetes associated with heart failure.

In analysing the populations’ use of the heartburn medication, “all of the datasets found very similar results in terms of this medicine causing serious gastrointestinal infections,” says Pratt.

But when they analysed the diabetes medication, Pratt says they started to see differences between the five countries, indicating the drug might have a different effect on people depending on their ethnic background.

“When we looked at the association in the Asian population, we weren’t able to see the effect, but when we looked in the Caucasian population in Australia and Canada, we found the association.

“So the application is to start to look at whether there is some genetic differences in the way people respond to medicines and know what the risks and the benefits might be across ethnicities,” she says.

One of the challenges Pratt faced in developing the highly mathematical tool has been making it accessible for more people.

She says UniSA Professor Libby Roughead has been instrumental in helping her to apply the numerical tool visually in a “real-world” healthcare setting.

At the moment, “the datasets are held by either the governments or the hospitals in each of the countries, but the actual output of the tool should be available to general practitioners, scientists and regulators,” says Pratt.

“So what we are trying to do is visually provide an output to the people who are going to use it at the point of prescribing medicines.”

“Some of the things we’ve been trying to do is look at how the data can tell you stories, rather than just give you numbers.”

At the moment the tool produces a visual graph charting when medicines are prescribed and superseded across populations, while highlighting peaks in adverse effects at certain points in time.

“I’d like to see this work integrated into the regulatory systems of all these countries and make it a world-wide surveillance system.”

Pratt met with her colleagues from AsPEN in Thailand this week to discuss the global expansion of the rapid detection tool.

This article was first published by The Lead on 24 November 2015. Read the original article here.

Using Big Data to save tiny lives

Computer scientist Carolyn McGregor has developed a disruptive technology utilising big data that is set to start a new era in personalised medicine. Her life-saving Artemis IT platform analyses patterns in data such as heartbeats and breathing in newborn babies, spotting problems before they are apparent to medical staff. The approach has great potential to save lives and is now being applied beyond the neonatal intensive care ward, to astronauts and tactical response units.

In 1999, computer scientist Carolyn McGregor found herself in a neonatal ward in Sydney’s Nepean Hospital, surrounded by newborn babies, each connected to a range of medical monitoring devices.

“I was watching all of these medical devices flash different numbers, alarms going off, and I was just looking at the sheer volume of the data and thinking there’s just such a rich source of data here and wondering what was happening with all the data that was on the screen,” she recalls.

McGregor, Canada Research Chair in Health Informatics at the University of Ontario Institute of Technology in Oshawa, Canada, discovered that measurements were being jotted down on paper charts every 30 or 60 minutes. “I thought, these numbers are changing every second or even faster. There’s so much we could potentially do with all of that,” she says.

That meeting was the spark for McGregor’s work in the use of big data in neonatal health and she is now a leading international researcher in critical-care health informatics. Before moving to Canada in 2007, McGregor established, grew and led Health Informatics Research at Western Sydney University, where her internationally recognised research was supported by over $1 million in grant funding from sources such as the Australian Research Council and the Telstra Broadband Fund. This was foundational research that led to her going on to establish her award winning Artemis Platform.

Typically a nurse in an intensive care ward watches a patient’s breathing and heartbeat, essentially to make sure they’re still alive and haven’t gone into cardiac arrest or another life threatening situation. But as McGregor suspected, the data can tell doctors and nurses so much more than that, when harnessed and analysed properly.

Subtle changes in the patterns of breathing, heart rate and other indicators can all show changes in the patient’s condition that might indicate something more serious, but are undetectable from traditional observation.

For instance, neonatal sepsis is the leading cause of death among newborn babies in both the developing and developed world.

“If you watch the behaviour of the heart, the heartbeat actually starts to become very regular or more regular if the body’s coming under stress, like it does when you have an infection. So because we watch every beat of the heart, we can tell if we’re starting to see a regular heart rate. Couple that with some other indicators and it gives doctors a better tool to help them to say this is probably infection,” says McGregor.
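That kind of pattern lends itself to a streaming check. The sketch below flags when the variability of recent inter-beat intervals collapses; the window and threshold are illustrative, and this is not the Artemis platform’s actual algorithm.

```python
# Sketch of a streaming regularity check: flag when the variability of recent
# inter-beat intervals collapses. Window and threshold are illustrative; this
# is not the Artemis platform's actual algorithm.
import statistics
from collections import deque

class RegularityMonitor:
    def __init__(self, window: int = 60, threshold_ms: float = 5.0):
        self.intervals = deque(maxlen=window)   # recent inter-beat gaps (ms)
        self.threshold_ms = threshold_ms        # hypothetical alert level

    def add_beat_interval(self, ms: float) -> bool:
        """Return True when variability drops below the alert threshold."""
        self.intervals.append(ms)
        if len(self.intervals) < self.intervals.maxlen:
            return False                        # not enough data yet
        return statistics.stdev(self.intervals) < self.threshold_ms

monitor = RegularityMonitor()
# A stream of suspiciously uniform ~500 ms intervals trips the flag:
alerts = [monitor.add_beat_interval(500 + (i % 3)) for i in range(60)]
print(alerts[-1])   # True: near-zero variability across the window
```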

The Artemis platform that McGregor and her research team have developed records more than 1200 readings every second, helping doctors harness and manage all of the information the medical devices produce, and providing a mechanism to analyse that information in complex ways. It allows them to choose which indicators and conditions they want to monitor, and to track those important subtle changes.

It is a lifesaving technology for these tiny patients, for whom a few hours can make a major difference in recovery. “We can see these patterns sometimes 24 hours before the baby starts to really succumb and show signs of an aggressive infection,” McGregor says. Neonatal infections can cause lifelong health issues for sufferers, such as with their lungs.

Along with improving outcomes for individual patients, the technology has the potential to help health care systems save money. For instance, if a baby acquires an infection in the neonatal unit then the length of their stay is typically doubled – a two-month stay becomes a four-month stay. Identifying and treating these infections earlier has the potential to slash these times.

So far the Artemis platform is being used in partnering hospitals in Canada, China and the USA. It has developed to the point where it is scalable and will be rolled out to more hospitals in the near future.

McGregor says neonatal babies are arguably the most complex patient population, so solving a problem for them first, means it will be easier to solve for other populations. Indeed, McGregor’s work has applications beyond neonatal critical care. Variations in the heartbeat, for instance, can indicate a viral or bacterial infection, the onset of depression, drowsiness, or post-traumatic stress disorder.

It also has application beyond the traditional healthcare sector. A conversation with former Canadian astronaut Dave Williams led to a joint project with the Canadian Space Agency and NASA on how the technology can be used to monitor the health of astronauts when they travel into space.

Astronauts share several similarities with neonatal babies, McGregor says. “Both have to do with adaptation. There’s a physical body change when a baby is born, and when it’s born early the change happens before the body’s ready. The lungs have to start functioning to provide oxygen to the body, and the heart changes its function when you’re born. And when an astronaut goes into space, they have to deal with weightlessness; there is a risk from radiation, and the impact of weightlessness on the body can cause problems. We need monitoring systems to help watch the body adapt,” she says.

There are plans to use the system on NASA’s planned journey to Mars in the next couple of decades, because there will be weeks at a time when the alignment of the moon and the planets cut the astronauts off from communication with Earth.

McGregor is also working with tactical response teams. When soldiers or police have to clear a building or rescue a hostage, their adrenalin can surge and their heart rate can accelerate to such an extent that they’re at risk of passing out. A platform called Athena gathers and monitors the soldiers’ physical indicators as they complete virtual reality training, and provides analytics of how the body behaves during the training activity. In this way they can understand how they react in those scenarios, which helps them learn to control their physical reactions.


McGregor grew up in the Hills district of Sydney’s north-west and says she always had an affinity with maths and enjoyed logic puzzles, so her maths teacher suggested she study computing after finishing school.

She enrolled in computer science at the University of Technology Sydney and at the same time worked at St George Bank as a computer science cadet. Following her studies, she joined and ultimately led a project at St George to set up what was then called an executive information system and would now be referred to as big data. “It was the first of the new type of computing systems to analyse the way the business ran, as opposed to the computing systems we originally had, which were systems to help the company run,” she says.

After a stint at Woolworths using data to understand what customers were buying and how to group products in the store to induce them to spend more, McGregor enrolled at the University of Technology Sydney to do her PhD in computer science, and then began to teach part time at Western Sydney University.

It was then that Dr Mark Tracy, a neonatologist from the Nepean Hospital, approached Western Sydney University and said he’d like to work with the computing and maths departments because he had more data than he knew what to do with – a visit that set McGregor on her current path.

McGregor says the practical experience many Australians gain during their education, by spending time working in companies while they study, is invaluable – an opportunity that many other countries do not provide.

As McGregor completed her undergraduate degree, she was one of only five women in a class of around 100. Women in science and IT can sometimes feel an inferiority complex, she says, but a well-functioning innovative environment needs different perspectives and people of different backgrounds, genders and cultures.

“So for women I would say, acknowledge the skill set that you have and the abilities that you have. You have a fantastic potential to make a significant difference in the technology space.”

Australian workers are highly regarded overseas, she says. “I think the Australian culture is to just get in, contribute, make a difference, get it done. We have a very good reputation as a highly skilled workforce to come into companies, whether you’re bringing innovation or you’re just bringing commitment,” says McGregor.

While McGregor currently bases herself in Canada, she is an honorary professorial fellow at the University of Wollongong, south of Sydney, which enables her to supervise students in Australia and to bring her research to the country.

McGregor says she is inspired by the possibilities for further innovations in the use of big data for medical research.

“I really think we’re just at that tip of the iceberg of a whole new wave of doing research in the medical space,” she says.

“This is the new face of health care. In partnership with genomics, for every individual using Fitbits and other personalised devices, the way forward will be to manage your own health and wellness. We are building the platforms and tools to do this.”

– Christopher Niesche

This article was first published by Australia Unlimited on 29 October 2015. Read the original article here.

Read more about Carolyn McGregor here.


More information means better predictions

In the era of ‘big data’, researchers are reaping the rewards and making better predictions from working with increasingly vast amounts of information about our planet. And datasets don’t get much larger than those used for modelling climatic events and simulating the impacts of global warming on the Earth’s surface.

The primary tools for modelling the climate are Atmosphere–Ocean General Circulation Models (AOGCMs). To improve the credibility of AOGCMs, the World Climate Research Programme established the Coupled Model Intercomparison Project (CMIP). This facilitates comparison of different models to identify common deficiencies and stimulate investigation into their possible causes.

Better predictions: CMIP5

CMIP5 is the fifth phase of CMIP and a multi-model framework of unprecedented scale. It incorporates many more simulations than earlier versions, including those based on historical concentrations, experiments for investigating climate sensitivity, and four emission scenarios reflecting differing potential pathways to 2100.

Use of datasets produced by CMIP5 is widespread: several thousand researchers access the CMIP5 datasets via the Earth System Grid website, and 28 modelling groups worldwide work on models that input to CMIP5 activity. Over 1000 peer-reviewed papers using the datasets have been published in a range of respected climate journals, for example: Journal of Climate (184 papers), Geophysical Research Letters (129 papers), and Climate Dynamics (122 papers).
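For readers unfamiliar with the format, CMIP5 output is distributed as NetCDF files named by variable, model, experiment and time span. The Python sketch below shows how a researcher might open one such file and compute a global-mean temperature series; the file name follows the CMIP5 naming convention but is a hypothetical example, and real files are downloaded from an Earth System Grid node.

import numpy as np
import xarray as xr

# Hypothetical CMIP5 file: monthly near-surface air temperature ('tas')
# from one model under the RCP4.5 emission scenario, 2006-2100.
ds = xr.open_dataset("tas_Amon_ACCESS1-0_rcp45_r1i1p1_200601-210012.nc")
tas = ds["tas"]

# Area-weight by latitude so polar grid cells don't count as much as
# equatorial ones, then average over the globe and by year.
weights = np.cos(np.deg2rad(ds["lat"]))
global_mean = tas.weighted(weights).mean(dim=("lat", "lon"))
annual = global_mean.groupby("time.year").mean()
print(annual.values[:5])  # first five years of the projected series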

“Being the latest generation, the CMIP5 models are the most valuable resource we have in the field.”

In Australia, researchers at The Centre for Australian Weather and Climate Research, a partnership between the Bureau of Meteorology and CSIRO, have employed output from CMIP5 models to further our understanding of the current climate in the Pacific region and make better predictions about future climate.

The research, undertaken as part of the Pacific-Australia Climate Change Science Adaptation Planning (PACCSAP) program, provides insights into the current and future impacts of climate change on the Pacific and the implications for communities in the region.

The work further reinforces the strong credentials of climate research in Australia, which also boasts centres such as the ARC Centre of Excellence for Climate System Science at the University of New South Wales (see Share issue 21).

One of the research streams of PACCSAP has projected the impact of extreme weather events, such as tropical cyclones, on the region’s future climate. The output from CMIP5 models was key to simulating the conditions for the genesis and behaviour of tropical cyclones.

“Being the latest generation, the CMIP5 models are the most valuable resource we have in the field,” says Dr Sally Lavender, Research Scientist at CSIRO’s Oceans and Atmosphere division. “The real advantage with CMIP5 is there are more models than the previous generation with a broader set of experiments, and all the models are much better in terms of sophistication. They also tend to be higher resolution and more have sub-daily time fields which, for modelling tropical cyclones, is very important.”

Dr Lavender is currently working to extend previous research using CMIP5 models to observe why and where cyclones form, and what determines their tracks. “We’re analysing the CMIP5 models to see how well they represent those processes in the real world to produce a selection of models that are good at representing tropical cyclones over the Australian region. We can then use these models to generate more informed projections of tropical cyclones under future climate scenarios.”

Research to date shows there is likely to be a reduction in the overall frequency of tropical cyclones in the Australian region; however, the proportion of high intensity cyclones is likely to increase. That needs to be taken into account in future building standards and disaster readiness planning.

Story provided by Refraction Media.

Originally published in Share, the newsletter magazine of the Australian National Data Service (ANDS).

One small step for open data…

NASA has a plan. Not one, in this case, about spaceships and astronauts, but something far more ‘down to earth’: open data. The organisation’s Plan for Increasing Access to the Results of Scientific Research was first published in late 2014, laying out NASA’s commitment to open up its datasets for international reuse. Full implementation of the plan is set to be in place from October 2015.

The plan aims, in NASA’s words, to “ensure public access to publications and digital data sets arising from NASA research, development, and technology programs”.

Done properly, opening up complex data sets for public analysis and reuse can lead to new and exciting discoveries, sometimes by those with nothing more than a keen amateur interest (or perhaps obsession) with the topic.

NASA is fully aware of this potential. It says it wants to support researchers to make new findings based on its data, not just in the US but around the globe. As if to prove the point, NASA’s Data Stories website highlights a number of case studies of people reusing its datasets in original applications, such as a ‘Solar System Simulator’ created by Canadian website developer Martin Vezina.

NASA also knows it needs to show commitment to scientific integrity and the accuracy of its research data and wants to encourage others to do the same. So by publishing its own datasets, NASA’s team are setting a benchmark for researchers hoping to grab a slice of the organisation’s annual research investment – a whopping US$3 billion. A condition of funding those research contracts, outlined in the 2014 document, is that researchers must develop their own data management plans describing how they will provide access to their scientific data in digital format. One small step for open data, one giant leap for new scientific discovery?

“This plan will ‘ensure public access to publications and digital data sets arising from NASA research, development, and technology programs’.”


How public data is being reused: The Australian Survey of Social Attitudes

The Australian Survey of Social Attitudes (AuSSA) is the main source of data for the scientific study of the social attitudes, beliefs and opinions of the nation.

It measures how those attitudes change over time as well as how they compare with other societies, which helps researchers better understand how Australians think and feel about their lives. Similar surveys are run in other countries, meaning data from AuSSA also allows us to compare Australia with countries all over the world.

Access to the AuSSA data has allowed independent researchers to explore changes in social attitudes in Australia over time. For example, Andrew Norton (now at the Grattan Institute in Melbourne) has analysed AuSSA data to examine changes in attitudes towards same-sex relationships between 1984 and 2009, noting the major shifts in favour of same-sex relationships during that period.

AuSSA is often used as a reference point for public policy debate. A number of media articles have been based on its findings, discussing topics as diverse as climate change, the welfare state and the kindness of Australians.

Similarly Australian Policy Online includes 18 different papers making use of AuSSA, including papers on perceptions of democracy, population growth, cultural identity and tax policy.

AuSSA datasets can be accessed via its website.

With thanks to Steve McEachern, Director of the Australian Data Archive at Australian National University.


Story provided by Refraction Media. Originally published in Share, the newsletter magazine of the Australian National Data Service (ANDS).

Featured image source (above): NASA.


Why DVDs are the new cool tech

In this era of big data, storage capacity is everything. Storing the vast amounts of data being generated requires an ever-growing number of large data centres, some of which are industrial-scale operations consuming as much electricity as a small town.

In the quest for greater storage capacity, researchers at Swinburne University have achieved a breakthrough, increasing the storage capacity of a DVD from a meagre 4.7 gigabytes to a staggering 1000 terabytes – the equivalent of storing 50,000 high-definition movies.

Rapid commercialisation of the research has positioned it as a finalist under the best commercial deal category for the 2015 .

“Our first motivation was scientific curiosity: could we increase the storage capacity of the disc?” says lead researcher Professor Min Gu. “The storage capacity of optical discs is determined by the number of dots that can be burned into the disc, which in turn is determined by the wavelength of the laser used to burn the dots.”

“To put more dots on the disc beyond conventional DVDs, we had to address a physical limit. Our approach overcame the minimum dot size determined by the [diffraction] law to produce an extremely tiny spot of light.” Each dot on the disc is a binary digit, or bit, representing 0 or 1.

Optical discs have significant advantages over other data storage technologies – such as hard disk drives, USB flash drives and SD cards – in terms of cost, longevity and reliability. However, their low storage capacity has been their major limiting factor.

Professor Min Gu, lead researcher at Swinburne University, demonstrates the technology used to massively increase the storage capacity of DVDs.

Using nanotechnology, Gu and his colleagues Dr Xiangping Li and Dr Yaoyu Cao have developed a technique using two laser beams, instead of the conventional single beam, with different colours for recording onto the disc.

One beam, referred to as the ‘writing beam’, records the information, while the second beam inhibits the writing beam, essentially performing an anti-recording function. This produces a spot of light nine nanometres in effective diameter – around one ten-thousandth the width of a human hair.
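Those figures invite a back-of-envelope check. The Python sketch below rests on two loudly flagged assumptions – a standard DVD recording annulus and one bit per 9 nm × 9 nm cell – and suggests a single layer holds on the order of 14 terabytes, so the quoted 1000 terabytes would imply stacking data in several tens of layers, consistent with the three-dimensional nature of the technique.

import math

dot_pitch = 9e-9                     # demonstrated spot size: 9 nanometres
r_outer, r_inner = 58.5e-3, 22.5e-3  # assumed usable recording annulus (metres)

area = math.pi * (r_outer**2 - r_inner**2)   # usable area of one layer, m^2
bits_per_layer = area / dot_pitch**2         # one bit per 9 nm x 9 nm cell
tb_per_layer = bits_per_layer / 8 / 1e12     # convert bits to terabytes

print(f"~{tb_per_layer:.0f} TB per layer")               # ~14 TB
print(f"~{1000 / tb_per_layer:.0f} layers for 1000 TB")  # ~70 layers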

“One data centre at the moment can be the size of a football stadium. We can reduce the size to one box of discs,” explains Gu. The impact of this technology, however, goes beyond just storage capacity, and has significant implications for energy consumption.

“Big data storage already consumes 3% of electricity. If we were to record all the information produced by Australia in 2011, we would have to use all the electricity consumed for domestic use that year. Optical discs are what we call ‘cool technology’ – they don’t require cooling systems, and they also have long lifetimes of around 20-30 years.”

Gu describes how the technology has progressed from publication of the research (co-first authored by Dr Zongsong Gan) in Nature Communications in 2013 to commercialisation.

“Two weeks after we published the results we received a call from the investment advisor for Optical Archive Inc. saying that ‘your technology will be very useful for big data.’”

Optical Archive Inc., which licensed the technology, was purchased by Sony Corporation of America in May 2015.

Gu believes that the first prototype of the technology will be available in around three years’ time.

– Carl Williams

Big data to solve global issues

Curtin University’s spatial sciences teams are using big data, advanced processing power and community engagement to solve social and environmental problems.

Advanced facilities and expertise at Perth’s Pawsey Supercomputing Centre support the Square Kilometre Array – a multi-array telescope due to launch in 2024 – and enable high-end science using big data.

Individual computers at the $80 million facility have processing power in excess of a petaflop (one quadrillion floating point operations per second) – that’s 100,000 times the flops handled by your average Mac or PC.
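That multiplier is easy to verify with a quick calculation, assuming an average consumer machine of the time sustains on the order of 10 gigaflops (an illustrative figure, not one from the article):

petaflop = 1e15   # one quadrillion floating point operations per second
desktop = 1e10    # assumed ~10 gigaflops for an average Mac or PC
print(petaflop / desktop)  # 100000.0 - the '100,000 times' quoted above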

Curtin University is a key participant in iVEC, which runs the Pawsey Centre, and a partner in the CRC for Spatial Information. As such, it is at the forefront of research efforts to use big data to solve global issues.

For instance, says the head of Curtin’s Department of Spatial Sciences Professor Bert Veenendaal, the university’s researchers are using Pawsey supercomputers to manage, compile and integrate growing volumes of data on water resources, land use, climate change and infrastructure.

“There is a rich repository of information and knowledge among the vast amounts of data captured by satellites, ground and mobile sensors, as well as the everyday usage information related to people carrying mobile devices,” he says.

“Increasing amounts of data are under-utilised because of a lack of knowhow and resources to integrate and extract useful knowledge,” he explains.

“Big data infrastructures coupled with increasing research in modelling and knowledge extraction will achieve this.”

Curtin’s projects include mapping sea-level rise and subsidence along the Western Australian coastline near Perth, generating high-resolution maps of the Earth’s gravity field and modelling climate over regional areas, such as Bhutan in South Asia, across multiple time scales.

Some research projects have the potential to expand and make use of big data in the future, particularly in the area of community development.

In one such project, the team worked with members of a rural community in the Kalahari Desert, Botswana, to collect information and map data using geographic information science. 

This helped the local community to determine the extent of vegetation cover in their local area, water access points for animals and how far the animals travelled from the water points to food sources.

Using this data, one local woman was able to create a goat breeding business plan to develop a herd of stronger animals. 

According to Veenendaal, there is potential for big data to be used for many regional and national issues. 

“Projects like this have the potential to provide data acquisition, analysis and knowledge that will inform intelligent decision-making about land use and community development on local, regional and national scales,” he says.

While procuring more funding for the Botswana project, Curtin’s researchers are planning future big data projects, such as applying global climate change models to regional areas across multiple time scales, and bringing together signals from multiple global navigation satellite systems, such as the USA’s GPS, China’s BeiDou and the EU’s Galileo. – Laura Boness

www.curtin.edu.au

www.crcsi.com.au 

www.ivec.org

Connecting science with industry