Tag Archives: society

New digital mentoring project announced

Featured image above: At IRCA’s launch of the Indigenous digital mentoring project inDigiMOB. Credit: IRCA

A new digital mentoring project developed by the Indigenous Remote Communications Association (IRCA) was launched by the Minister for Local Government and Community Services, the Honourable Bess Price, in Arlparra on 15 April 2016.

The inDigiMOB program is committed to addressing the digital exclusion experienced by many Indigenous Australians in remote communities and will play a key role in bridging the digital divide. It will establish a network of Indigenous mentors who provide peer-to-peer training and support in digital literacy, cyber safety and internet access in response to local and immediate needs.

“Over the next three years inDigiMOB will help address barriers to the take up and use of digital technology in a number of Indigenous communities in the Northern Territory,” says Daniel Featherstone, General Manager of IRCA.

“With support from local community organisations we will work to ensure remote Indigenous people are not left behind.”

“Where there is affordable and appropriate access, many Indigenous Australians in remote communities are rapid adopters of new digital technology,” says Featherstone.

“However, there are still too many people who lack access and capabilities to fully embrace the social, economic and cultural possibilities of being connected. inDigiMOB will address this and in doing so, create meaningful jobs.”

IRCA will pilot inDigiMOB in Arlparra and several Alice Springs Town Camps with support from delivery partners Batchelor Institute and Tangentyere Council respectively.

inDigiMOB is funded by Telstra.

This article was first published by IRCA on 21 April 2016. Read the original article here.

About IRCA

IRCA is the peak body that represents and advocates for the media and communications interests of remote and very remote Aboriginal and Torres Strait Islander communities in Australia.  IRCA promotes the important role played by remote Indigenous media in maintaining language and culture and providing tools for self-representation and community development.  IRCA supports remote Indigenous media organisations (RIMOs) that have played a key role in the development of the remote media and communications industry in Australia.

www.irca.net.au

Microcapsules for type-1 diabetes

Curtin University researchers are a step closer to establishing a way for people with type-1 diabetes to introduce insulin into the body without the need for injections, through the development of a unique microcapsule.

People with type-1 diabetes, a condition where the immune system destroys cells in the pancreas, generally have to inject themselves with insulin daily and test glucose levels multiple times a day.

Dr Hani Al-Salami from Curtin’s School of Pharmacy is leading the collaborative project using cutting-edge microencapsulation technologies to design and test whether microcapsules are a viable alternative treatment for people with type-1 diabetes.

“Since 1921, injecting insulin into muscle or fat tissue has been the only treatment option for patients with type-1 diabetes,” Al-Salami says.

“The ideal way to treat the illness, however, would be to have something, like a microcapsule, that stays in the body and works long-term to treat the uncontrolled blood glucose associated with diabetes.”

The microcapsule contains pancreatic cells and can be implanted in the body, where it delivers insulin to the bloodstream.

“We hope the microcapsules might complement or even replace the use of insulin in the long-term, but we are still a way off. Still, the progress is encouraging and quite positive for people with type-1 diabetes,” Al-Salami says.

Researchers say the biggest challenge in the project to date has been creating a microcapsule that could carry the cells safely, for an extended period of time, without causing an unwanted reaction by the body such as inflammation or graft failure.

“We are currently carrying out multiple analyses examining various formulations and microencapsulating methods, in order to ascertain optimum engineered microcapsules capable of supporting cell survival and functionality,” Al-Salami says.

The research was conducted in partnership with the University of Western Australia. Click here to read the scientific paper, published in Biotechnology Progress.

– Susanna Wolz

This article was first published by Curtin University. Read the original media release here.

 

Australia’s STEM workforce

Featured image above from the Australia’s STEM Workforce Report

Australians with qualifications in science, technology, engineering and mathematics (STEM) are working across the economy in many roles, from wine-makers to financial analysts, according to a new report from the Office of the Chief Scientist.

Australia’s Chief Scientist Dr Alan Finkel says Australia’s STEM Workforce is the first comprehensive analysis of the STEM-qualified population and is a valuable resource for students, parents, teachers and policy makers. The report is based on data from the 2011 Census, the most recent comprehensive and detailed data set of this type of information. The report will serve as a benchmark for future studies.

“This report provides a wealth of information on where STEM qualifications – from both the university and the vocational education and training (VET) sectors – may take you, what jobs you may have and what salary you may earn,” Finkel says.

“Studying STEM opens up countless job options and this report shows that Australians are taking diverse career paths.”

The report investigates the workforce destinations of people with qualifications in STEM fields, looking at the demographics, industries, occupations and salaries that students studying for those qualifications can expect in the workforce.

Click here to see an infographic of key facts from the Australia’s STEM Workforce Report

The report found that fewer than one-third of STEM university graduates were female, with physics, astronomy and engineering having even lower proportions of female graduates. Biological sciences and environmental studies graduates were evenly split between the genders. In the VET sector, only 9% of those with STEM qualifications were women.

Finkel says that even more worrying than the gender imbalance in some STEM fields is the pay gap between men and women in all STEM fields revealed in the report. These differences cannot be fully explained by having children or by the higher proportion of women working part-time.

The analysis also found that gaining a doctorate is a sound investment, with more STEM PhD graduates in the top income bracket than their Bachelor-qualified counterparts. However, these same STEM PhD holders are less likely to own their own business or work in the private sector.

Finkel says that preparing students for a variety of jobs and industries is vital to sustaining the future workforce.

“This report shows that STEM-qualified Australians are working across the economy. It is critical that qualifications at all levels prepare students for the breadth of roles and industries they might pursue.”

Click here to download the full Australia’s STEM Workforce report.

Click here to read Alan Finkel’s Foreword, or click here to read the section of the report that interests you.

This information was first shared by Australia’s Chief Scientist on 31 March 2016. Read the original media release here.

CO₂ cuts nutrition

Climate change is affecting the Earth through more frequent and intense weather events, such as heatwaves, as well as through rising sea levels, and is predicted to do so for generations to come. Changes brought on by anthropogenic climate change, from activities such as the burning of fossil fuels and deforestation, are affecting natural ecosystems on land and at sea, and across all human settlements.

Increased atmospheric carbon dioxide (CO₂) levels – which have jumped by a third since the Industrial Revolution – will also have an effect on agriculture and the staple plant foods we consume and export, such as wheat.

Stressors on agribusiness, such as prolonged droughts and the spread of new pests and diseases, are exacerbated by climate change and need to be managed to ensure the long-term sustainability of Australia’s food production.

Researchers at the Primary Industries Climate Challenges Centre (PICCC), a collaboration between the University of Melbourne and the Department of Economic Development, Jobs, Transport and Resources in Victoria, are investigating the effects of increased concentrations of CO₂ on grain yield and quality to reveal how a more carbon-enriched atmosphere will affect Australia’s future food security.

An aerial view of the Australian Grains Free Air CO₂ Enrichment (AGFACE) project, where researchers are investigating the effects of increased concentrations of carbon dioxide on grain yield and quality.

Increasing concentrations of CO₂ in the atmosphere significantly increase water efficiency in plants and stimulate plant growth, a process known as the “fertilisation effect”. This leads to more biomass and a higher crop yield; however, elevated carbon dioxide (eCO₂) could decrease the nutritional content of food.

“Understanding the mechanisms and responses of crops to eCO₂ allows us to focus crop breeding research on the best traits to take advantage of the eCO₂ effect,” says Dr Glenn Fitzgerald, a senior research scientist at the Department of Economic Development, Jobs, Transport and Resources.

According to Fitzgerald, the research being carried out by PICCC, referred to as Australian Grains Free Air CO₂ Enrichment (AGFACE), is also being done in a drier environment than anywhere previously studied.

“The experiments are what we refer to as ‘fully replicated’ – repeated four times and statistically verified for accuracy and precision,” says Fitzgerald. “This allows us to compare our current growing conditions of 400 parts per million (ppm) CO₂ with eCO₂ conditions of 550 ppm – the atmospheric CO₂ concentration level anticipated for 2050.”

The experiments involve injecting CO₂ into the atmosphere around plants via a series of horizontal rings that are raised as the crops grow, and the process is computer-controlled to maintain a CO₂ concentration level of 550 ppm.
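In outline, such a control loop repeatedly measures the CO₂ concentration and adjusts the injection rate toward the 550 ppm setpoint. The snippet below is only a schematic illustration of that idea – a simple proportional controller with made-up gains and units, not AGFACE’s actual control software:

```python
def co2_injection_rate(measured_ppm: float, setpoint_ppm: float = 550.0,
                       gain: float = 0.8, max_rate: float = 100.0) -> float:
    """Toy proportional controller for a CO₂ enrichment ring.

    Injects faster the further the measured concentration falls below
    the setpoint, and injects nothing at or above it. All parameter
    values are illustrative, not real AGFACE settings.
    """
    error = setpoint_ppm - measured_ppm
    # Clamp between zero (no injection) and the hardware's maximum rate.
    return min(max_rate, max(0.0, gain * error))
```

At the setpoint the controller injects nothing; as wind disperses CO₂ and the measured level drops, the injection rate rises in proportion to the shortfall.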

Horizontal rings injecting carbon dioxide into the atmosphere as part of the AGFACE project. Credit: AGFACE

“We’re observing around a 25–30% increase in yields under eCO₂ conditions for wheat, field peas, canola and lentils in Australia,” says Fitzgerald.


Pests and disease

While higher CO₂ levels boost crop yields, there is also a link between eCO₂ and an increase in viruses that affect crop growth.

Scientists at the Department of Economic Development, Jobs, Transport and Resources have been researching the impact of elevated CO₂ levels on plant vector-borne diseases, and they have observed an increase of 30% in the severity of the Barley Yellow Dwarf Virus (BYDV).

Higher CO₂ levels are linked with an increase in the severity of Barley Yellow Dwarf Virus.

Spread by aphids, BYDV is a common plant virus that affects wheat, barley and oats, and causes yield losses of up to 50%.

“It’s a really underexplored area,” says Dr Jo Luck, director of research, education and training at the Plant Biosecurity Cooperative Research Centre. “We know quite a lot about the effects of drought and increasing temperatures on crops, but we don’t know much about how the increase in temperature and eCO₂ will affect pests and diseases.

“There is a tension between higher yields from eCO₂ and the impacts on growth from pests and diseases. It’s important we consider this in research when we’re looking at food security.”


This increased yield is due to more efficient photosynthesis and to eCO₂ improving the plant’s water-use efficiency.

With atmospheric CO₂ levels rising, less water will be required to produce the same amount of grain. Fitzgerald estimates about a 30% increase in water efficiency for crops grown under eCO₂ conditions.

But nutritional content suffers. “In terms of grain quality, we see a decrease in protein concentration in cereal grains,” says Fitzgerald. The reduction is due to a decrease in the level of nitrogen in the grain, which occurs because the plant becomes less efficient at drawing nitrogen from the soil.

The same reduction in protein concentration is not observed in legumes, however, because of the action of rhizobia – soil bacteria in the roots of legumes that fix atmospheric nitrogen and so provide the plant with an alternative source of nitrogen.

“We are seeing a 1–14% decrease in grain-protein concentration [for eCO₂ levels] and a decrease in bread quality,” says Fitzgerald.

“This is due to the reduction in protein and because changes in the protein composition affect qualities such as elasticity and loaf volume. There is also a decrease of 5–10% in micronutrients such as iron and zinc.”

This micronutrient deficiency, referred to as “hidden hunger”, is a major health concern, particularly in developing countries, according to the International Food Policy Research Institute’s 2014 Global Hunger Index: The Challenge of Hidden Hunger.

There could also be health implications for Australians. As the protein content of grains diminishes, carbohydrate levels increase, leading to food with higher caloric content and less nutritional value, potentially exacerbating the current obesity epidemic.

The corollary from the work being undertaken by Fitzgerald is that in a future CO₂-enriched world, there will be more food but it will be less nutritious. “We see an increase in crop growth on one hand, but a reduction in crop quality on the other,” says Fitzgerald.

Fitzgerald says more research into nitrogen-uptake mechanisms in plants is required in order to develop crops that, when grown in eCO₂ environments, can capitalise on increased plant growth while maintaining nitrogen, and therefore protein, levels.

For now, though, while an eCO₂ atmosphere may be good for plants, it might not be so good for us.

– Carl Williams

www.piccc.org.au

www.pbcrc.com.au

Algorithms making social decisions

In this Up Close podcast episode, informatics researcher Professor Paul Dourish explains how algorithms, as more than mere technical objects, guide our social lives and organisation, and are themselves evolving products of human social actions.

“Lurking within algorithms, unknown to people and certainly not by design, can be all sorts of unconscious biases and discriminations that don’t necessarily reflect what we as a society want.”

The podcast is presented by Dr Andi Horvath. Listen to the episode below, or read on for the full podcast transcript.

Professor Paul Dourish

Dourish is a Professor of Informatics in the Donald Bren School of Information and Computer Sciences at the University of California, Irvine, with courtesy appointments in Computer Science and Anthropology.

His research focuses primarily on understanding information technology as a site of social and cultural production; his work combines topics in human-computer interaction, social informatics, and science and technology studies.

He is the author, with Genevieve Bell, of Divining a Digital Future: Mess and Mythology in Ubiquitous Computing (MIT Press, 2011), which examines the social and cultural aspects of the ubiquitous computing research program. He is a Fellow of the Association for Computing Machinery (ACM), a member of the Special Interest Group for Computer-Human Interaction  (SIGCHI) Academy, and a recipient of the American Medical Informatics Association (AMIA) Diana Forsythe Award and the Computer Supported Co-operative Work (CSCW) Lasting Impact Award.

Podcast Transcript

VOICEOVER: This is Up Close, the research talk show from the University of Melbourne, Australia.

HORVATH: I’m Dr Andi Horvath. Thanks for joining us. Today we bring you Up Close to one of the very things that shapes our modern lives. No, not the technology as such, but what works in the background to drive it: the algorithm, the formalised set of rules governing how our technology is meant to behave.

As we’ll hear, algorithms both enable us to use technology and to be used by it. Algorithms are designed by humans and just like the underpinnings of other technologies, say like drugs, we don’t always know exactly how they work. They serve a function but they can have side-effects and unexpectedly interact with other things with curious or disastrous results.

Today, machine learning means that algorithms are interacting with, or developing, other algorithms without human input. So how is it that they can have a life of their own? To take us Up Close to the elusive world of algorithms is our guest, Paul Dourish, a Professor of Informatics in the Donald Bren School of Information and Computer Sciences at UC Irvine. Paul has written extensively on the intersection of computer science and social science and is in Melbourne as a visiting Miegunyah Fellow. Hello, and welcome to Up Close.

DOURISH: Morning, it’s great to be here.

HORVATH: Paul, let’s start with the term algorithm. We hear it regularly in the media and it’s even in product marketing, but I suspect few of us really know what the word refers to. So let’s get this definition out of the way: what is an algorithm?

DOURISH: Well, it is a pretty abstract concept so it’s not surprising if people aren’t terribly familiar with it. An algorithm is really just a way of going about doing something, a set of instructions or a series of steps you go through in order to produce some kind of computational result. So for instance, you know, when we were at school we all learned how to do long multiplication, and the way we teach kids to do multiplication, well, that’s an algorithm. It’s a series of steps that you can go through and you can guarantee that you’re going to get a certain kind of result. So algorithms then get employed in computational systems, in computer systems, to produce the functions that we want.
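Dourish’s long-multiplication example can be made concrete. The sketch below is one possible rendering of the schoolbook procedure in Python – the function name and the digit-string interface are choices made for this illustration, not anything from the interview:

```python
def long_multiply(a: str, b: str) -> str:
    """Schoolbook long multiplication on decimal digit strings.

    Mirrors the pencil-and-paper algorithm: multiply by one digit of b
    at a time, shift the partial product, carry, and add everything up.
    """
    digits_a = [int(d) for d in reversed(a)]
    result = [0] * (len(a) + len(b))
    for j, db in enumerate(reversed(b)):
        carry = 0
        for i, da in enumerate(digits_a):
            total = result[i + j] + da * int(db) + carry
            result[i + j] = total % 10   # keep one digit in this column
            carry = total // 10          # carry the rest to the next column
        result[j + len(digits_a)] += carry
    # Strip leading zeros and convert back to a string.
    while len(result) > 1 and result[-1] == 0:
        result.pop()
    return "".join(str(d) for d in reversed(result))
```

Run on "12" and "34" it produces "408", exactly as the pencil-and-paper steps would – the same algorithm, just carried out by a machine instead of a schoolchild.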

HORVATH: Where do we find algorithms? If I thought about algorithm-spotting say on the way to work, where do we actually encounter them?

DOURISH: Well, if you were to take the train, for instance, algorithms might be controlling the rate at which trains arrive and depart from stations to try to manage a stable flow of passengers through a transit system. If you were to Google something in the morning to look up something that you were going to do or perhaps to prepare for this interview, well an algorithm not only found the information for you on the internet, but it was also used to sort those search results and decide which one was the one to present to you at the top of the list and which one was perhaps going to come further down. So algorithms are things that lie behind the operation of computer systems; sometimes those are computer systems we are using directly and sometimes they are computer systems that are used to produce the effects that we all see in the world like for instance, the flow of traffic.

HORVATH: So Paul, we use algorithms every day in everything, whether it’s work, rest, play, but are we also being used by algorithms?

DOURISH: Well, I guess there’s a couple of ways we could think about that. One is that we all produce data; the things that we do produce data that get used by algorithms. If we want to think about an algorithm for instance that controls the traffic lights and to manage the flow of people through the streets of Melbourne, well, the flow of people through the streets of Melbourne is also the data upon which that algorithm is working. So we’re being used by algorithms in the sense perhaps that we’re all producing the data that the algorithm needs to get its job done.

But I think there’s also a number of ways in which we might start to think that we get enrolled in the processes and effects of algorithms, so if corporations and government agencies and other sorts of people are making use of algorithms to produce effects for us, then our lives are certainly influenced by those algorithms and by the kinds of ways that they structure our interaction with the digital world.

HORVATH: So algorithms that are responsible for say datasets or computational use, the people who create them are quite important. Who actually creates these algorithms? Are they created by governments or commerce?

DOURISH: They can be produced in all sorts of different kinds of places and if you were in Silicon Valley and you were the sort of person who had a brand new algorithm, you might also be the sort of person who would have a brand new start-up. By and large, algorithms are produced by computer scientists, mathematicians and engineers.

Many algorithms are fundamentally mathematical at their heart and one of the ways in which computer scientists are interested in algorithms is to be able to do mathematical analysis on the kinds of things that computers might do and the sort of performance that they might have. But computer scientists are also generally in the business of figuring out ways to do things and that means basically producing algorithms.

HORVATH: One of the reasons we hear algorithms a lot these days is because they’ve caused problems, or at least confusion. Can you give us some tangible examples of where that’s happened?

DOURISH: Sure. Well, I think we see a whole lot of those, and they turn up in the paper from time to time; some are trivial and amusing, and some have serious consequences. On the trivial and amusing side, we see algorithms that engage in classification, which is an important area for algorithmic processing, and classifications that go wrong – places where an algorithm decides that because you bought one product you are interested in a particular class of things, and it starts suggesting all these things to you.

I had a case with my television once where it had decided, because my partner was recording Rocky and Bullwinkle, an old cartoon series from America featuring a moose and a squirrel, that I must be interested in hunting programs, so it started recording hunting shows for me. So although they’re silly, these cases begin to show the way that algorithms have a role.

The more serious ones, though, are ones that begin to affect commerce and political life. A famous case in 2010 was what was called the flash crash, a situation in which the US stock market lost and then suddenly regained a huge amount of value – about 10 per cent of its value – all within half an hour, and nobody really knew why it happened. It turned out that instead of human beings buying and trading shares, it was actually algorithms buying and trading shares. Two algorithms were locked in a loop, one trying to offer shares for sale and one trying to buy them up, and suddenly it spiralled out of control. So these algorithms, because they play a role in so many different systems and appear in so many different places, can have these big impacts – and even the small, trivial cases begin to alert us to where the algorithms might be.

HORVATH: Tell us about privacy issues; that must be something that algorithms don’t necessarily take seriously.

DOURISH: Well, of course the algorithm works with whatever data it has to hand, and data systems may be more or less anonymised, they may be more or less private. One of the interesting problems perhaps is that the algorithm begins to reveal things that you didn’t necessarily know that your data might reveal.

For example, I might be very careful about being tracked by my phone. You know, I choose to turn off those features that say, for instance, where my home is. But if an algorithm can detect that I tend to be always at the same place at 11 o’clock at night, or my phone is always at the same place at 11 o’clock at night, and that’s where I start my commute to work in the morning, then those patterns begin to build up and there can be privacy concerns there. So algorithms begin to identify patterns in data, and we don’t necessarily know what those patterns are, nor are we necessarily provided with the opportunity to audit, control, review or erase the data. So that’s where the privacy aspects begin to become significant.
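The kind of inference Dourish describes takes only a few lines. The snippet below is a hypothetical illustration – the function, the rounding granularity and the data format are all invented for this example. Nothing in the input is labelled “home”, yet a likely home location emerges purely from the pattern of where the phone sits late at night:

```python
from collections import Counter
from datetime import datetime

def likely_home(fixes):
    """Guess a device's home location from timestamped position fixes.

    `fixes` is a list of (iso_timestamp, lat, lon) tuples. The inference
    comes purely from the pattern that the same rounded coordinate cell
    recurs late at night -- no fix carries a 'home' label.
    """
    late_night = Counter()
    for ts, lat, lon in fixes:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            # Round to ~1 km cells so nearby fixes cluster together.
            late_night[(round(lat, 2), round(lon, 2))] += 1
    return late_night.most_common(1)[0][0] if late_night else None
```

Two late-night fixes in the same cell are enough for the pattern to surface, even though the daytime fix elsewhere tells the algorithm nothing.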

HORVATH: Is there an upsurge in societal concern about algorithms? Really, I’m asking you the question: why should we care about algorithms? Do we need to take them more seriously?

DOURISH: I think people are beginning to pay attention to the ways in which there can be potentially deleterious social effects. I don’t want to sit here simply saying that algorithms are dangerous and we need to be careful, but on the other hand there is this fundamental question about knowing what it is the algorithm is doing and being conscious of its fairness.

On the trivial side, there is an issue that arose around the algorithm in digital cameras to detect faces, when you want to focus on the face. It turned out after a while that the algorithms in certain phones looked predominantly for white faces but were actually very bad at detecting black faces. Now, those kinds of bias aren’t very visible to us, as the camera just doesn’t work. Those are perhaps where as a society we need to start thinking about what is being done for us by algorithms, because lurking within those algorithms, unknown to people and certainly not by design, can be all sorts of unconscious biases and discriminations that don’t necessarily reflect what we as a society want.

HORVATH: Are we being replaced by algorithms? Is this something that’s threatening jobs as we know them?

DOURISH: Well, I certainly see plenty of cases where people are concerned about that and talk about it, and there’s been some in the press in the last couple of years that talk for instance about algorithms taking over HR jobs in human resources, interviewing people for jobs or matching people for jobs. By and large though, lots of these algorithms are being used to supplement and augment what people are doing. I don’t think we’ve seen really large-scale cases so far of people being replaced by algorithms, although it’s certainly a threat that employers and others can hold over people.

HORVATH: Sure. Draw the connection for us between algorithms and this emerging concept of big data.

DOURISH: Well, you can’t really talk about one without the other; they go together so seamlessly. Actually, one of the reasons that I’ve been talking about algorithms lately is precisely because there’s so much talk about big data just now. The algorithms and the data go together. The data provides the raw material that the algorithm processes and the algorithm is generally what makes sense of that data.

We talk about big data not least in terms of this idea of being able to capture and collect, to get information from all sorts of sensors, from all sorts of things about the world, but it’s the algorithm that then comes in and makes sense of that data, that identifies patterns and things that we think are useful or interesting or important. I might have a large collection of data that tells me what everybody in Victoria has purchased in the supermarket for the last month, but it’s an algorithm that’s going to be able to identify within that dataset well, here are dual income families in Geelong or the sort of person who’s interested in some particular kind of product and amenable to a particular kind of marketing. So they always go together; you never have one without the other.

HORVATH: But surely there are problems in interpretation and things get lost in translation.

DOURISH: That’s a really interesting part of the whole equation here. It’s generally human beings who have to do the interpretation; the algorithm can identify a cluster. It can say, look, these people are all like each other, but it tends to be a human being who comes along and says, now, what is it that makes those people like each other? Oh, it’s because they are dual income families in Geelong. There’s always a human in the loop here. Actually, the problem that we occasionally encounter – and it’s like that problem of inappropriate classification that I mentioned earlier – is that often we think we know what the clusters are that an algorithm has identified, until an example comes along that shows, oh, that wasn’t what it was at all. So the indeterminacy of both the data-processing part and the human interpretation is where a lot of the slippage can occur.
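A toy version of the cluster-then-interpret split Dourish describes: the sketch below is a minimal one-dimensional k-means, invented for this illustration rather than taken from any production system. It finds groups of similar numbers – say, weekly supermarket spends – but deciding that one group means “budget shoppers” and another “premium shoppers” is left entirely to the human reading the output:

```python
def kmeans_1d(values, k=2, iters=20):
    """Toy one-dimensional k-means.

    The algorithm finds the clusters; naming what the clusters *mean*
    is left to a human interpreter.
    """
    # Seed centroids with evenly spaced sorted values.
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for v in values:
            # Assign each value to its nearest centroid.
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(v - centroids[i]))
            groups[nearest].append(v)
        # Move each centroid to the mean of its group.
        centroids = [sum(g) / len(g) if g else c
                     for g, c in zip(groups, centroids)]
    return centroids, groups
```

Fed six weekly spends, it cleanly separates the low spenders from the high spenders; what those groups signify is the analyst’s call, not the algorithm’s.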

HORVATH: I’m Andi Horvath and you’re listening to Up Close. In this episode, we’re talking about the nature and consequences of algorithms with informatics expert Paul Dourish. Paul, given that algorithms are a formalised set of instructions, can’t they simply be written in English or any other human language?

DOURISH: Well, algorithms certainly are often written in English. There’s all sorts of ways in which we write them down. Sometimes they are mathematical equations that live on a whiteboard. They often take the form of what computer scientists call pseudo-code, which looks like computer code but isn’t actually executable by a computer, and sometimes they are in plain English. I used the example earlier of the algorithm that we teach to children for how to do multiplication; well, that was explained to them in plain English. So they can take all sorts of different forms. Really, that’s some of the difficulty with the notion of an algorithm: it’s a very abstract idea, and it can be realised in many different kinds of ways.

HORVATH: So algorithms, code and pseudo-code are just different forms of abstraction?

DOURISH: In a way, yes. Computer code is the stuff that we write that actually makes computers do things, and the algorithm is a rough description of what that code might be like. Real programs are written in specific programming languages. You might have heard of C++ or Java or Python, these are programming languages that people use to produce running computer systems. The pseudo-code is a way of expressing the algorithm that’s independent of any particular programming language. So if I have a great algorithm, an idea for how to produce a result or sort a list or something, I can express it in the pseudo-code and then different programmers who are working in different programming languages can translate the algorithm into the language that they need to use to get their particular work done.
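The distinction can be shown with a trivial example: the same algorithm expressed first as language-independent pseudo-code, then as one possible Python realisation (a different programmer could translate the identical pseudo-code into C++ or Java):

```python
# Pseudo-code (language-independent):
#   set best to the first item in the list
#   for each remaining item:
#       if the item is larger than best, set best to the item
#   return best

def find_largest(items):
    """One possible Python realisation of the pseudo-code above."""
    best = items[0]
    for item in items[1:]:
        if item > best:
            best = item
    return best
```

The pseudo-code captures the idea; the function is just one translation of it into a particular language.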

HORVATH: Right. Now, I’ve heard one of the central issues is that we can’t really read the algorithm once it’s gone into code. It’s like we can’t un-cook the cake or reverse engineer it. Why is that so hard?

DOURISH: Well, we certainly can in some cases; it’s not a hard and fast rule. In fact, most computer science departments, like the one here at Melbourne, will teach people how to write code so that you can see what’s going on. But there are a couple of complications that certainly can make it more difficult.

The first is that the structure of computer systems requires that you do more things than simply what the algorithm describes. An algorithm is an idealised version of what you might do, but in practice I might have to do all sorts of other things as well, like I’m managing the memory of the computer and I’m making sure the network hasn’t died and all these things. My program has lots of other things in it that aren’t just the algorithm but are more complicated.

Another complication is that sometimes people write code in such a way that it hides the algorithm for trade secret purposes. I don’t want somebody else to pick up on and get my proprietary algorithm, or the secret sauce of my business or program, and so I write the software in a deliberately somewhat obscure way.

Then the other problem is that sometimes algorithms are distributed in the world; they don’t all happen in one place. I think about the algorithms, for instance, that control how data flows across the internet and try to make sure there isn’t congestion or too much delay in different parts of the network. Well, those algorithms don’t really happen in one place, they happen between different computers. Little bits of it are on one computer and little bits of it are on the other, and they act together in coordination to produce the effect that we desire, so it can often be hard to spot the algorithm within the code.
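A widely known instance of such a distributed algorithm is the additive-increase/multiplicative-decrease rule behind TCP congestion control. The sketch below is a deliberately simplified, illustrative version of one sender’s local rule (the parameter values are illustrative defaults, not any standard’s): each machine runs only this fragment, yet across many senders the fragments jointly regulate the shared network.

```python
def aimd_step(window: float, loss_detected: bool,
              increase: float = 1.0, decrease: float = 0.5) -> float:
    """One step of additive-increase / multiplicative-decrease.

    Each sender runs only this local rule; no single machine holds
    'the' congestion-control algorithm for the whole network.
    """
    if loss_detected:
        # Congestion signal: back off sharply (multiplicative decrease),
        # but never below one packet in flight.
        return max(1.0, window * decrease)
    # No congestion seen: probe gently for spare capacity.
    return window + increase
```

Calling this repeatedly produces the familiar sawtooth: the window creeps up by one each round until a loss halves it, and the overall network behaviour emerges only from many such loops interacting.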

HORVATH: Tell us more about these curious features of algorithms. They almost sound like a life form.

DOURISH: Well, I think what often makes algorithms seem to take on a life of their own, if you will, is that intersection with data that we were talking about earlier, because I said data and algorithms go together. There is often a case for instance where I can know what the algorithm does but if I don’t know enough about the data over which the algorithm operates, all sorts of things can happen.

There’s a case that I like to use as an example that came from some work that a friend of mine did a few years ago where he was looking at the trending topics on Twitter, and he was working particularly with people in the Occupy Wall Street movement who were sure that they were censored because their movement, the political discussion around Occupy Wall Street, never became a trending topic on Twitter. People were outraged, how can Justin Bieber’s haircut be more important than Occupy Wall Street? When they talked to the Twitter people, the Twitter people were adamant that they weren’t censoring this, but nonetheless they couldn’t really explain in detail why it was that Occupy Wall Street had not become a trending topic.

You can explain the algorithm and what it does, you can explain the mathematics of it, you can explain the code, you can show how a decision is made, but that decision is made about a dataset that’s changing rapidly, that’s to do with everything that’s being tweeted online, everything that’s being retweeted, where it’s being retweeted, how quickly it’s being retweeted. What the algorithm does, even though it’s a known, engineered artefact, is still itself somehow mysterious.
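Twitter’s actual trending algorithm is proprietary, but the dynamic described above can be sketched with a toy scoring rule (all numbers hypothetical): trending systems typically reward sudden acceleration rather than sheer volume, which is how a steady, high-volume topic can lose to a spiking one.

```python
def trending_score(hourly_counts):
    """Toy trending score: how much the latest hour exceeds the
    topic's recent average (acceleration), not raw volume."""
    *history, latest = hourly_counts
    baseline = sum(history) / len(history)
    return latest / (baseline + 1)  # +1 avoids division by zero

# A topic discussed heavily but steadily (hypothetical numbers):
occupy = [9000, 9100, 8900, 9000]
# A topic that suddenly spikes from near silence:
haircut = [50, 40, 60, 5000]

print(trending_score(occupy))   # ≈ 1: large but flat, so it never trends
print(trending_score(haircut))  # ≈ 98: smaller but spiking, so it trends
```

Knowing this scoring rule still would not tell you in advance what will trend, because the answer depends entirely on the ever-changing counts fed into it — which is exactly the algorithm–data entanglement Dourish describes.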

So the lives that algorithms take on in practice for us when we encounter them in the world or when they act upon us or when they pop up in our Facebook newsfeed or whatever, is often unknowable and mysterious and lively, precisely because of the way the algorithm is entwined with an ever roiling dataset that keeps moving.

HORVATH: I love the term machine learning, and it’s really about computers interacting with computers, algorithms talking to other algorithms without the input of humans. That kind of spooks me. Where are we going?

DOURISH: Yeah. Well, I think the huge, burgeoning interest in machine learning has been spurred on by the big data movement. Machine learning is something that I was exposed to when I was an undergraduate student back more years ago than I care to remember; it’s always been there. But improvements in statistical techniques and the burgeoning interest in big data and the new datasets mean that machine learning has taken on a much greater significance than it had before.

What machine learning algorithms typically do is identify, again, patterns in datasets. They take large amounts of data and then tell us what’s going on in it. Inasmuch as we are generating more and more data, and inasmuch as more and more of our activities move online and become, if you like, “datafiable” – things that can now be identified as data rather than just as things we did – there is more and more opportunity for algorithms, and particularly for machine learning algorithms, to identify patterns within it.
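As a toy illustration (not any production system), a nearest-centroid classifier shows the basic move of “identifying patterns in a dataset”: it learns the average point of each labelled group, then classifies new points by proximity. All data here is invented:

```python
def fit_centroids(examples):
    """'Learn' a pattern: the average point of each labelled group."""
    sums, counts = {}, {}
    for (x, y), label in examples:
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {lbl: (sx / counts[lbl], sy / counts[lbl])
            for lbl, (sx, sy) in sums.items()}

def predict(centroids, point):
    """Classify a new point by its nearest learned centroid."""
    def dist2(c):
        return (c[0] - point[0]) ** 2 + (c[1] - point[1]) ** 2
    return min(centroids, key=lambda lbl: dist2(centroids[lbl]))

# Hypothetical training data: two loose clusters of points.
data = [((1, 1), "A"), ((2, 1), "A"), ((1, 2), "A"),
        ((8, 9), "B"), ((9, 8), "B"), ((8, 8), "B")]
model = fit_centroids(data)
print(predict(model, (2, 2)))   # → A
print(predict(model, (7, 9)))   # → B
```

Note that the “pattern” the model holds is just two averaged points; what those clusters *mean* is something a human must supply — the point Dourish makes next about knowing what an algorithm is saying about you.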

I think the question, as we said, is to what extent one knows what a machine learning algorithm is saying about one. Indeed, as I suggested with the Twitter case, even for people who work in this space, even for the people developing the algorithms, it can be hard to know. It’s that issue of knowing – of being able to examine algorithms, of making algorithms accountable to civic, political and regulatory processes – that’s where some of the real challenges posed by machine learning algorithms lie.

HORVATH: We’re exploring the social life of algorithms with computer and social scientist Paul Dourish right here on Up Close. And yes, we’re coming to you no doubt thanks to several useful algorithms. I’m Andi Horvath. Let’s keep moving with algorithms. You say that algorithms aren’t just technical, that they’re social objects. Can you tell us a bit more what that means?

DOURISH: Well, I think we can come at this from two sides. One side is the algorithms are social as well as technical because they’re put to social uses. They’re put to uses that have an impact on our world. For example, if I’m on Amazon and it recommends another set of products that I might like to look at, or it recommends some and not others, there’s some questions in there about why those ones are just the right ones. Those are cases where social systems, systems of consumption and purchase and identification and so forth are being affected by algorithms. That’s one way in which algorithms are social; they’re put to social purposes.

But of course, the other way that algorithms are social is that they are produced by people and organisations and professions and disciplines and all sorts of other things that have a grounding in the social world. So algorithms didn’t just happen to us, they didn’t fall out of the sky, we have algorithms because we make algorithms. And we make algorithms within social settings, and they reflect our social ideas or our socially-constructed ideas about what’s desirable, what’s interesting, what’s possible and what’s appropriate. Those are all ways in which the algorithms are pre-social. They’re not just social after the fact but they are social before the fact too.

HORVATH: Paul, you’ve mentioned how algorithms are kind of opaque, but yet you also mention that we need to make them accountable, submit them to some sort of scrutiny. So how do we go about that?

DOURISH: This is a real challenge that a number of people have been raising in the last couple of years and perhaps especially in light of the flash crash, that moment where algorithmic processing produced a massive loss of value on the US stock market. There are a number of calls for corporations to make visible aspects of their own algorithms and processing so that it can be deemed to be fair and above board. If you just think for a moment about how much of our daily life in the commercial sector is indeed governed by those algorithms and what kind of impact a Google search result ordering algorithm has; there’s lots of consequences there, so people have called for some of those to be more open.

People have also called for algorithms to be fixed. This is one of the other difficulties: algorithms shift and change, and corporations naturally change them around. There was some outrage when Facebook ran an experiment to see whether tweaking the algorithm to show people happier or less happy results actually changed their own mood and what kinds of things they saw. People were outraged at the idea that Facebook would tweak an algorithm that, even though it obviously belonged to Facebook, they felt was actually an important part of their lives. So keeping algorithms fixed in some sense is one argument that people have made, alongside opening things up to scrutiny.

But the problem with opening things up to scrutiny is, well, first: who can actually evaluate these things? Not all of us can. And of course, in the context of machine learning, the algorithm identifies patterns in data – but over what dataset is it operating? In fact, we can’t even really identify what those patterns are; we can only say there is a statistical pattern, and some human being is going to come along and assign some kind of value to it. So some algorithms are inherently inscrutable. The algorithm processes data and we can say what it says about the data, but if we don’t know what the data is, and we don’t know what examples it has been trained on and so forth, then we can’t really say what the overall effect and impact is.

HORVATH: Will scrutiny of algorithms, whether we audit or control them, be affected by, say, intellectual property laws?

DOURISH: Well, this is a very murky area, and in particular it’s a murky area internationally, where there are lots of different standards in different countries about what kind of things can be patented, controlled and licensed and so forth. Algorithms themselves are patentable objects. Many people choose to patent their algorithms, but of course patenting something requires disclosing it and so lots of other corporations decide to protect their algorithms as trade secrets, which are basically just things you don’t tell anybody.

We can also ask how algorithms move around in the world. Those intellectual property mechanisms – licensing rights, patenting and so forth – are the ways algorithms might be fixed in one place, within a particular corporate boundary, but also move around in the world. No one has really, I think, got a good handle on the relationship between algorithms and intellectual property.

They are clearly valuable intellectual property and they get licensed in a variety of ways, but this is again one of those places where the relationship between algorithm and code is a complicated one. We have developed an understanding of how to manage these things for code; we have a less good understanding right now of how to manage them for algorithms. And I should probably say, since we’re also talking about data, that we have no idea at all how to do this for data.

HORVATH: These algorithms, they’ve really got a phantom-like presence and yet they’ve got so much potential and possibility. They are practical tools that help with our lives. But what are the consequences of further depending upon the algorithms in our world?

DOURISH: I think it’s inevitable and not really problematic. From my perspective, algorithms in and of themselves are not necessarily problematic objects. Again, if we say that even the things that we teach our children for how to do multiplication are algorithms, there’s no particular problem about depending on that. I think again it’s the entwining of algorithms and data, and one of the things that an algorithmic world demands is the production of data over which those algorithms can operate, and all the questions about ownership and about where that algorithmic processing happens matter.

For example, one manifestation of an algorithmic and data-driven world is one in which you own all your data, you do the algorithmic processing, and then you make the results available if you so choose. Another version of that data-centred world is one in which somebody else collects data about you, does all the processing and then tells you the results – and there is a variety of arrangements in between. So I don’t think the issue is necessarily about algorithms and how much we depend on them. Some people have claimed we’re losing our own ability to remember things because now Google is remembering for us.

HORVATH: It’s an outsourced memory.

DOURISH: Yes, that’s right, or there are lots of stories about people using their satnav and driving into the river because they’re no longer remembering how to actually drive down the road or evaluate the things in front of them – but I’m a little sceptical about those. I do think the questions of how we want to harness the power of algorithmic processing, how we want to make it available to people, and how it should interact with the data being collected from or about people – those are the questions we need to have a conversation about.

HORVATH: Paul, I have to ask you, just like we use our brain to understand our brain, can we use algorithms to understand and scrutinise algorithms?

DOURISH: [Laughs] Well, we can and actually, we do. One of the ways in which we do already is that when somebody develops a new machine learning algorithm we have to evaluate how well it does. We have to know is this algorithm really reliably identifying things. We sort of pit algorithms against each other to try to see whether the algorithm is doing the right work and evaluate the results of other kinds of algorithms. So that actually already happens.
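A minimal sketch of that evaluation process, with invented data: two candidate predictors are pitted against each other on the same held-out test set and compared by accuracy.

```python
def accuracy(predict, test_set):
    """Fraction of held-out examples a predictor gets right."""
    hits = sum(1 for x, label in test_set if predict(x) == label)
    return hits / len(test_set)

# Hypothetical task: flag messages as spam from their link count.
test_set = [(0, "ham"), (1, "ham"), (0, "ham"),
            (5, "spam"), (7, "spam"), (2, "ham"), (6, "spam")]

always_ham = lambda x: "ham"                        # naive baseline
threshold  = lambda x: "spam" if x >= 3 else "ham"  # learned-style rule

for name, algo in [("always_ham", always_ham), ("threshold", threshold)]:
    print(name, accuracy(algo, test_set))
# the threshold rule (1.0) beats the baseline (~0.57) on this toy data
```

Real machine-learning benchmarks are far more elaborate, but the shape is the same: one algorithm sits in judgement over the output of another.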

Similarly, as I suggested with the internet, the algorithm for congestion control is really a series of different algorithms happening in different places that work in coordination – or fail to – to produce a smooth flow of data. So we don’t have to worry just yet, I think, about a war between the algorithms or any kind of algorithmic singularity.

HORVATH: Paul, what do you mean by the singularity? Is this really a Skynet moment?

DOURISH: Well, the singularity is this concept that suggests that at some point in the development of intelligent systems, they may become so intelligent that they can design their own future versions and the humans become irrelevant to the process of development. It’s a scary notion; it’s one I’m a little sceptical about, and I think actually the brittleness of contemporary algorithms is a great example of why we’re not going to get there within any short time.

I think the question, though, is still how we want to understand the relationship between algorithms and the data over which they operate. A great example is IBM’s Watson, which a couple of years ago won the Jeopardy! TV show – a real breakthrough for artificial intelligence. But on the other hand, you’ve got to ask: what is it that Watson knows about? Well, a lot of what Watson knows, it knows from Wikipedia. I’m not very happy when my students cite Wikipedia, so I’m not terribly sure I need to be afraid of a machine intelligence singularity that makes all its inferences on the basis of Wikipedia.

HORVATH: Paul, thanks for being our guest on Up Close and allowing us to glimpse into the world of the mysterious algorithm. I feel like I’ve been in the movie Tron.

DOURISH: [Laughs] Yes, well, we don’t quite have the glowing light suits unfortunately.

HORVATH: We’ve been speaking about the social lives of algorithms with Paul Dourish, a professor of informatics in the Donald Bren School of Information and Computer Sciences at UC Irvine. You’ll find a full transcript and more info on this and all our episodes on the Up Close website. Up Close is a production of the University of Melbourne, Australia. This episode was recorded on 22 February 2016. Producer was Eric van Bemmel and audio recording by Gavin Nebauer. Up Close was created by Eric van Bemmel and Kelvin Param. I’m Dr Andi Horvath. Cheers.

VOICEOVER: You’ve been listening to Up Close. For more information visit upclose.unimelb.edu.au. You can also find us on Twitter and Facebook.

– Copyright 2016, the University of Melbourne.

Podcast Credits

Host: Dr Andi Horvath
Producer: Eric van Bemmel
Audio Engineer: Gavin Nebauer
Voiceover: Louise Bennet
Series Creators: Kelvin Param, Eric van Bemmel

This podcast was first published on 11 March 2016 by The University of Melbourne’s Up Close. Listen to the original podcast here.

Collaborate or crumble

Bookshelves in offices around Australia groan under the weight of unimplemented reports of research findings. Likewise, the world of technology is littered with software and gadgetry that has failed to gain adoption, for example 3D television and the Apple Newton. But it doesn’t have to be this way.

Adoption of research is a critical success measure for Cooperative Research Centres (CRCs). One CRC in particular, the CRC for Water Sensitive Cities, has succeeded in having its research adopted by governments, companies and even the United Nations. Its secret is fruitful collaborations spanning diverse academic disciplines to deliver usable results. These are the kind of collaborations CRCs are well placed to deliver, argues Professor Rebekah Brown, project leader and former Chief Research Officer of the CRC for Water Sensitive Cities and director of the Monash Sustainability Institute.

The best research solutions are not always the ones adopted. To change that, says Brown, developers need to know how their solutions will be received, and how to adapt them so that people actually want them.

“Physical scientists, for example, benefit from understanding the political, social and economic frameworks they’re operating in, to position the science for real-world application,” she says.

The big-picture questions around knowledge and power, disadvantage and information access, political decision-making, community needs and aspirations, policy contexts and how values are economised – these are the domains of the social sciences. When social science expertise is combined with that of the physical sciences, for example, the research solutions they produce can have a huge impact. In the case of the CRC for Water Sensitive Cities, such solutions have influenced policy, strategy and regulations for the management of urban stormwater run-off, for example. Brown and her colleagues have found it takes a special set of conditions to cultivate this kind of powerful collaborative research partnership.

The CRC for Water Sensitive Cities has seen considerable growth. It started in 2005 as a $4.5 million interdisciplinary research facility with 20 Monash University researchers and PhD students from civil engineering, ecology and sociology. The facility grew over seven years to become a $120 million CRC with more than 85 organisations, including 13 research institutes and other organisations such as state governments, water utilities, local councils, education companies and sustainability consultancies. It has more than 230 researchers and PhD students, and its work has been the basis for strategy, policy, planning and technology in Australia, Singapore, China and Israel.

Blue and green corridors in urban areas are part of the CRC for Water Sensitive Cities’ research into managing water as the world becomes increasingly urbanised.

In a 2015 Nature special issue, Brown and Monash University colleagues Ana Deletic and Tony Wong, project leader and CEO respectively of the CRC for Water Sensitive Cities, shared their ‘secret sauce’ on bridging the gap between the social and biophysical sciences, which allowed them to develop a partnership blueprint for implementing urban water research.


8 tips to successful collaboration

Professor Rebekah Brown, courtesy of the Monash Sustainability Institute

Cultivating interdisciplinary dialogue and forming productive partnerships takes time and effort, skill, support and patience. Brown and her colleagues suggest the following:

1 Forge a shared mission to provide a compelling account of the collaboration’s overall goal and to maintain a sense of purpose for all the time and effort needed to make it work;

2 Ensure senior researchers are role models, contributing depth in their discipline and demonstrating the skills needed for constructive dialogue;

3 Create a leadership team composed of people from multiple disciplines;

4 Put incentives in place for interdisciplinary research such as special funding, promotion and recognition;

5 Encourage researchers to put their best ideas forward, even if unfinished, while being open to alternative perspectives;

6 Develop constructive dialogue skills by providing training and platforms for experts from diverse disciplines and industry partners to workshop an industry challenge and find solutions together;

7 Support colleagues as they move from being I-shaped to T-shaped researchers, prioritising depth early on and embracing breadth by building relationships with those from other fields;

8 Run special issues of single-discipline journals that focus on interdisciplinary research and create new interdisciplinary journals with T-shaped editors, peer-reviewers or boards.

Source: Brown, R.R., Deletic, A. and Wong, T.H.F. (2015), ‘How to catalyse collaboration’, Nature, 525, pp. 315–317.


A recent Stanford University study found that almost 75% of cross-functional teams within a single business fail. Magnify that by taking PhD researchers whose careers are deeply invested in niche areas and asking them to work with other niches across other organisations, and it all sounds impossible. Overcoming resistance to collaboration requires time and effort.

Yet as research partnerships blossom, so do business partnerships. “Businesses don’t think of science in terms of disciplines as scientists do,” says Brown. “Researchers need to be able to tackle complex problems from a range of perspectives.”

Part of the solution lies in the ‘shape’ of the researchers: more collaborative interdisciplinary researchers are known as ‘T-shaped’ because they have the necessary depth of knowledge within their field (the vertical bar of the T), as well as the breadth (the horizontal bar) to look beyond it as useful collaborators – engaging with different ways of working.

Some scholars, says Brown, tend to view their own discipline as having the answer to every problem and see other disciplines as being less valuable. In some ways that’s not surprising given the lack of exposure they may have had to other disciplines and the depth of expertise they have gained in their own.

“At the first meeting of an interdisciplinary team, they might try to take charge, for example talking without listening to others or understanding their contribution. But in subsequent meetings, they begin to see the value the other disciplines bring – which sometimes leaves them spellbound.

“It’s very productive once people reach the next stage in a partnership where they develop the skills for interdisciplinary work and there’s constructive dialogue and respect,” says Brown.

In a recent article in The Australian, CSIRO chief executive and laser physicist Dr Larry Marshall describes Australians as great inventors, but he emphasises that innovation is a team sport and we need to do better at collaboration. He points out that Australia has the lowest research collaboration rates in the Organisation for Economic Co-operation and Development (OECD), and attributes this to two things: insufficient collaboration with business, and scientists competing against each other.

“Overall, our innovation dilemma – fed by our lack of collaboration – is a critical national challenge, and we must do better,” he says.

Brown agrees, saying sustainability challenges like those addressed by the CRC for Water Sensitive Cities are “grand and global challenges”.

“They’re the kind of ‘wicked problem’ that no single agency or discipline, on its own, could possibly hope to resolve.”

The answer, it seems, is interdisciplinary.


Moving forward

Alison Mitchell, courtesy of Vitae

There’s a wealth of great advice that CRCs can tap into. For example the Antarctic Climate & Ecosystems CRC approached statistical consultant Dr Nick Fisher at ValueMetrics Australia, an R&D consultancy specialising in performance management, to find the main drivers of the CRC’s value as perceived by its research partners, so the CRC could learn what was working well and what needed to change.

Fisher says this kind of analysis can benefit CRCs at their formation, and can be used for monitoring and in the wind-up phase for final evaluation.

When it comes to creating world-class researchers who are T-shaped and prepped for research partnerships, Alison Mitchell, a director of Vitae, a UK-based international program dedicated to professional and career development for researchers, is an expert. She describes the Vitae Researcher Development Framework (RDF), which is a structured model with four domains covering the knowledge, behaviour and attributes of researchers, as a significant approach that’s making a difference to research careers worldwide.

The RDF framework uses four ‘lenses’ – knowledge exchange, innovation, intrapreneurship [the act of behaving like an entrepreneur while working with a large organisation] and entrepreneurship – to focus on developing competencies that are part and parcel of a next generation research career. These include skills for working with academic research partners and industry.


– Carrie Bengston

watersensitivecities.org.au

www.acecrc.org.au

Dr Alan Finkel will be Australia’s new Chief Scientist

Featured photo: Greg Ford/Monash University

New Chief Scientist Dr Alan Finkel will take over the role once the sitting Chief Scientist, Professor Ian Chubb, finishes his five-year stint in the job on 31 December this year.

Finkel was most recently Chancellor of Monash University, a post he has held since 2008. He is also the President of the Australian Academy of Technological Sciences and Engineering (ATSE).

Finkel is an outspoken advocate for science awareness and popularisation. He is a patron of the Australian Science Media Centre and helped launch the popular science magazine Cosmos.

He is also an advocate for nuclear power, arguing that “nuclear electricity should be considered as a zero-emissions contributor to the energy mix” in Australia.

The Australian Academy of Science (AAS) President, Professor Andrew Holmes, welcomes the expected appointment of Finkel to the new Chief Scientist role.

“The Academy is looking forward to the government’s announcement, but Finkel would be an excellent choice for this position. I’m confident he would speak strongly and passionately on behalf of Australian science, particularly in his advice to government,” he says.

“The AAS and ATSE have never been closer; we have worked together well on important issues facing Australia’s research community, including our recent partnership on the Science in Australia Gender Equity initiative.”

Holmes also thanked the outgoing Chief Scientist for his strong leadership of science in Australia, including establishing the Australian Council of Learned Academies (ACOLA) as a trusted source of expert, interdisciplinary advice to the Commonwealth Science Council.

“Since his appointment, Chubb has been a tireless advocate of the fundamental importance of science, technology, engineering and mathematics (STEM) skills as the key to the country’s future prosperity, and a driving force behind the identification of strategic research priorities for the nation,” says Holmes.

This article was first published on The Conversation on 26 October 2015. Read the original article here.

Expert reactions:

Karen Taylor is Founder and Business Director of Refraction Media

“Finkel is an energetic advocate for STEM across all levels of society, from schools and the general public to corporate leaders. We’re excited and optimistic about the fresh approach science and innovation is enjoying.” 

Professor Emeritus Sir Gustav Nossal is Emeritus Professor in the Department of Pathology at the University of Melbourne

“This is truly the most fantastic news. Finkel is an extraordinary leader. He has proven himself in personal scientific research. He has succeeded in business in competitive fields. It is difficult to think of anyone who would do this important job with greater distinction.”

Dr Ross Smith is President of Science & Technology Australia

“Finkel has a profound understanding of the place of science in a flourishing modern economy, as a scientist, entrepreneur and science publisher of real note. We look forward to working closely with Finkel, as we jointly pursue better links between STEM and industry.”

Saving grains

Each year, the fungal disease tan spot costs the Australian economy more than half a billion dollars. Tan spot, also known as yellow spot, is the most damaging disease to our wheat crops, annually causing an estimated $212 million in lost production and requiring about $463 million worth of control measures. Fungal disease also causes huge damage to barley, Australia’s second biggest cereal crop export after wheat. It should come as no surprise, then, that the nation’s newest major agricultural research facility, Curtin University’s Centre for Crop and Disease Management (CCDM), is focusing heavily on the fungal pathogens of wheat and barley.

Launched in early 2014, with the announcement of an inaugural bilateral research agreement between Curtin and the Australian Government’s Grains Research and Development Corporation (GRDC), the CCDM already has a team of about 40 scientists, with that number expected to double by 2016.

“We are examining the interactions of plants and fungal pathogens, and ways and means of predicting how the pathogen species are going to evolve so that we might be better prepared,” says CCDM Director, Professor Mark Gibberd.

An important point of difference for the centre is that, along with a strongly relevant R&D agenda, its researchers will be working directly with growers to advise on farm practices. Influencing the development and use of faster-acting and more effective treatments is part of the CCDM’s big-picture approach, says Gibberd. This encompasses both agronomy (in-field activities and practices) and agribusiness (the commercial side of operations).

“We want to know more about the issues that challenge farmers on a day-to-day basis,” explains Curtin Business School’s John Noonan, who is overseeing the extension of the CCDM’s R&D programs and their engagement with the public. The CCDM, he explains, is also focused on showing impact and return on investment in a broader context.

Two initiatives already making a significant impact on growers’ pockets include the tan spot and Septoria nodorum blotch programs. Tan spot, Australia’s most economically significant wheat disease, is caused by the fungus Pyrenophora tritici-repentis. Septoria nodorum blotch is a similar fungal infection and Western Australia’s second most significant wheat disease.

Curtin University researchers were 2014 finalists in the Australian Museum Eureka Prize for Sustainable Agriculture for their work on wheat disease. Their research included the development of a test that enables plant breeders to screen germinated seeds for resistance to these pathogens and subsequently breed disease-resistant varieties. It’s a two-week test that replaces three years of field-testing and reduces both yield loss and fungicide use.

When fungi infect plants, they secrete toxins to kill the leaves so they can feed on the dead tissue (toxins: ToxA for tan spot, and ToxA, Tox1 and Tox3 for Septoria nodorum blotch). The test for plant sensitivity involves injecting a purified form of these toxins – 30,000 doses of which the CCDM is supplying to Australian wheat breeders annually.

“We have seen the average tan spot disease resistance rating increase over the last year or so,” says Dr Caroline Moffat, tan spot program leader. This means the impact of the disease is being reduced. “Yet there are no wheat varieties in Australia that are totally resistant to tan spot.”

“The development of fungicide resistance is one of the greatest threats to our food biosecurity, comparable to water shortage and climate change.”

Worldwide, there are eight variants of the tan spot pathogen P. tritici-repentis. Only half of them produce ToxA, suggesting there are other factors that enable the pathogen to infiltrate a plant’s defences and take hold. To investigate this, Moffat and her colleagues have deleted the ToxA gene in samples of P. tritici-repentis and are studying how it affects the plant-pathogen interaction.

During the winter wheat-cropping season, Moffat embarks on field trips across Australia to sample for P. tritici-repentis to get a ‘snapshot’ of the pathogen’s genetic diversity and how this is changing over time. Growers also send her team samples as part of a national ‘Stop the Spot’ campaign, which was launched in June 2014 and runs in collaboration with the GRDC. Of particular interest is whether the pathogen is becoming more virulent, which could mean the decimation of popular commercial wheat varieties.


Wheat fungal diseases can regularly cause a yield loss of about 15–20%. But for legumes – such as field pea, chickpea, lentil and faba bean – fungal infections can be even more devastating. The fungal disease ascochyta blight, for example, readily causes yield losses of about 75% in pulses. It makes growing pulses inherently risky, explains ascochyta blight program leader, Dr Judith Lichtenzveig.

In 1999, Western Australia’s chickpea industry was almost wiped out by the disease and has never fully recovered. With yield reliability and confidence in pulses still low, few growers include them in their crop rotations – to the detriment of soil health.

Pulse crops provide significant benefit to subsequent cereals and oilseeds in the rotation, says Lichtenzveig, because they add nitrogen and reduce the impact of soil and stubble-borne diseases. The benefits are seen immediately in the first year after the pulse is planted. The chickpea situation highlights the need to develop new profitable varieties with traits desired by growers and that suit the Australian climate.

The CCDM also runs two programs concerned with barley, both headed by Dr Simon Ellwood. His research group is looking to develop crops with genetic resistance to two diseases that account for more than half of all yield losses in this important Australian crop – net blotch and powdery mildew.

Details of the barley genome were published in the journal Nature in 2012. The grain contains about 32,000 genes, including ‘dominant R-genes’ that provide mildew resistance. The dominant R-genes allow barley plants to recognise corresponding avirulence (Avr) genes in mildew; if there’s a match between a plant R-gene and pathogen Avr genes, the plant mounts a defence response and the pathogen is unable to establish an infection. It’s relatively commonplace, however, for the mildew to alter its Avr gene so that it’s no longer recognised by the plant R-gene.

“This is highly likely when a particular barley variety with a given R-gene is grown over a wide area where mildew is prevalent, as there is a high selection pressure on mutations to the Avr gene,” explains Ellwood. This means the mildew may become a form that is unrecognised by the barley.
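
The R-gene/Avr-gene matching described above amounts to a simple recognition rule, sketched below as a toy model. This is an illustration only, not CCDM code, and all gene names and pairings are invented for the example:

```python
# Toy model of the gene-for-gene interaction described above.
# A plant resists infection if any of its dominant R-genes recognises
# a corresponding Avr gene carried by the pathogen. All gene names
# and pairings here are invented for illustration.

RECOGNISES = {"R1": "Avr1", "R2": "Avr2"}  # R-gene -> Avr gene it detects

def plant_resists(r_genes, avr_genes):
    """Defence is mounted if any R-gene finds its matching Avr gene."""
    return any(RECOGNISES.get(r) in avr_genes for r in r_genes)

# A matching pair triggers the defence response...
print(plant_resists({"R1"}, {"Avr1"}))      # True: infection blocked

# ...but a mutation that alters the Avr gene escapes recognition,
# which is how mildew overcomes a widely planted R-gene.
print(plant_resists({"R1"}, {"Avr1-mut"}))  # False: infection succeeds
```

The second call captures the selection pressure Ellwood describes: a single Avr mutation is enough to defeat resistance based on one dominant gene.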

Many of the malting barley varieties grown in Western Australia, with the exception of Buloke, are susceptible to mildew. This contrasts with spring barley varieties being planted in Europe and the USA that have been bred to contain a gene called mlo, which provides resistance to all forms of powdery mildew.

Resistance to net blotch also occurs on two levels in barley. “As with mildew, on the first level, barley can recognise net blotch Avr genes early on through the interaction with dominant R-genes. But again, because resistance is based on a single dominant gene interaction, it can be readily lost,” says Ellwood. “If the net blotch goes unrecognised, it secretes toxins that allow the disease to take hold.”

On the second level, these toxins interact with certain gene products so that the plant cells become hypersensitised and die. By selecting for barley lines without the sections of genes that make these products, the crop will have a durable form of resistance. Indeed, Ellwood says his team has found barley lines with these characteristics. The next step is to determine how many genes control this durable resistance. “Breeding for host resistance is cheaper and more environmentally friendly than applying fungicides,” Ellwood adds.



Numerous fungicides are used to prevent and control fungal pathogens, and they can be costly. Some have a common mode of action, and history tells us there’s a good chance they’ll become less effective the more they’re used. “The development of fungicide resistance is one of the greatest threats to our food biosecurity ahead of water shortage and climate change,” says Gibberd. “It’s a very real and current problem for us.”

Fungicides are to grain growers what antibiotics are to doctors, explains Dr Fran Lopez-Ruiz, head of the CCDM’s fungicide resistance program. “The broad-spectrum fungicides are effective when used properly, but if the pathogens they are meant to control start to develop resistance, their value is lost.” Of the three main types of leaf-based fungicides used for cereal crops, demethylation inhibitors (DMIs) are the oldest, cheapest and most commonly used.

Lopez-Ruiz says that to minimise the chance of fungi becoming resistant, sprays should not be used year-in, year-out without a break. The message hasn’t completely penetrated the farming community and DMI-resistance is spreading in Australia. A major aim within Lopez-Ruiz’s program is to produce a geographical map of fungicide resistance. “Not every disease has developed resistance to the available fungicides yet, which is a good thing,” says Lopez-Ruiz.

DMIs target an enzyme called CYP51, which makes a cholesterol-like compound called ergosterol that is essential for fungal cell survival. Resistance develops when the pathogens accumulate several mutations in their DNA that change the structure of CYP51 so it’s not affected by DMIs.

In the barley disease powdery mildew in WA, a completely new set of mutations has evolved, resulting in the emergence of fungicide-resistant populations. The first of these mutations has just been identified in powdery mildew in Australia’s eastern states, making it essential that growers change their management tactics to prevent the development of full-blown resistance. Critical messages such as these are significant components of John Noonan’s communications programs.

The CCDM is researching solutions to plant diseases such as powdery mildew in barley (above top), and Septoria nodorum blotch (above middle) in wheat, with Dr Caroline Moffat (above bottom) leading a program to tackle the wheat tan spot fungus.

Resistance to another group of fungicides, QoIs (quinone outside inhibitors), began to appear within two years of their availability here. They are, however, still widely used in a mixed treatment, which hinders the development of resistance. Lopez-Ruiz says it’s important we don’t end up in a situation where there’s no solution: “It’s not easy to develop new compounds every time we need them, and it’s expensive – more than $200 million to get it to the growers”.

The high cost of testing and registering products can deter companies from offering their products to Australian growers – particularly if, as in the case of legumes, the market is small.

To help convince the Australian Pesticides and Veterinary Medicines Authority that it should support the import and use of chemicals that are already being safely used overseas, the CCDM team runs a fungicide-testing project for companies to trial their products at sites where disease pressures differ – for example, because of climate. This scheme helps provide infrastructure and data to fast-track chemical registrations.

“This is a massive achievement, and we have already shown that the use of more expensive chemicals can be justified on the basis of an increase in crop yield.”


A global problem

More than half of Australia’s land area is used for agriculture – 8% of this is used for cropping, and much of the rest for activities such as forestry and livestock farming. Although Australia’s agricultural land area has decreased by 15% during the past decade, from about 470 million to 397 million ha, it’s more than enough to meet current local demand and contribute to international markets.

Nevertheless, the world’s population continues to grow at a rapid rate, increasing demands for staple food crops and exacerbating food shortages. Australia is committed to contributing to global need and ensuring the sustained viability of agriculture. To this end, Professor Richard Oliver, Chief Scientist of Curtin’s Centre for Crop and Disease Management (CCDM), has established formal relationships with overseas institutions sharing common goals (see page 26). This helps CCDM researchers access a wider range of relevant biological resources and keep open international funding opportunities, particularly in Europe.

“The major grant bodies have a very good policy around cereal research where the results are freely available,” says Oliver. “There’s also the possibility to conduct large experiments requiring lots of space – either within glasshouses or in-field – which would be restricted or impossible in Australia.” It’s a win-win situation.

Branwen Morgan

 

Four things to protect yourself from cyberattack

It’s easy to get lost in a sea of information when looking at cybersecurity issues – hearing about hacks and cyberattacks as they happen is a surefire way to feel helpless and totally disempowered.

What follows is a sort of future shock, where we become fatalistic about the problem. After all, 86% of organisations from around the world surveyed by PwC reported exploits of some aspect of their systems within a one year period. That represented an increase of 38% on the previous year.

However, once the situation comes into focus, the problem becomes much more manageable. There is a range of things we can easily implement to dramatically reduce the risk of an incident.

For example, Telstra estimates that 45% of security incidents are the result of staff clicking on malicious attachments or links within emails. Yet that is something that could be fairly easily fixed.

Confidence gap

There is currently a gap between our confidence in what we can do about security and the amount we can actually do about it. That gap is best filled by awareness.

Many organisations, such as the Australian Centre for Cyber Security, American Express and Distil Networks provide basic advice to help us cope with future shock and start thinking proactively about cybersecurity.

The Australian Signals Directorate (ASD) – one of our government intelligence agencies – also estimates that adhering to its Top Four Mitigation Strategies would prevent at least 85% of targeted cyberattacks.

So here are some of the top things you can do to protect yourself from cyberattack:

1 Managed risk

First up, we need to acknowledge that there is no such thing as perfect security. That message might sound hopeless but it is true of all risk management; some risks simply cannot be completely mitigated.

However, there are prudent treatments that can make risk manageable. Viewing cybersecurity as a natural extension of traditional risk management is the basis of all other thinking on the subject. Yet a report by CERT Australia states that 61% of organisations do not have cybersecurity incidents in their risk register.

ASD also estimates that the vast majority of attacks are not very sophisticated and can be prevented by simple strategies. As such, think of cybersecurity as something that can be managed, rather than cured.

2 Patching is vital

Patching is so important that ASD mentions it twice on its top four list. Cybersecurity journalist Brian Krebs says it three times: “update, update, update”.

Update your software, phone and computer. As a rule, don’t use Windows XP, as Microsoft is no longer providing security updates.

Updating ensures that known vulnerabilities are fixed, and software companies employ highly qualified professionals to develop their patches. It is one of the few ways you can easily leverage the expertise of cybersecurity specialists in the field.

3 Restricting access means restricting vulnerabilities

The simple rule to protect yourself from cyberattack is: don’t have one gateway for everything. If all it takes to get into the core of a system is one password, then all it takes is one mistake for the gate to be opened.

Build administrator privileges into your system so that people can only use what they are meant to. For home businesses it could mean something as simple as having separate computers for home and work, or not giving administrator privileges to your default account.

It could also be as simple as having a content filter on employee internet access so they don’t open the door when they accidentally click on malware.
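
The principle behind restricted access can be modelled as roles that are granted only the actions they need. The sketch below is a hypothetical illustration; the roles and actions are invented, not any specific product’s configuration:

```python
# Sketch of restricted access: each role is granted only the actions
# it requires, so compromising one account does not open every gate.
# Roles and actions are invented examples.

PERMISSIONS = {
    "admin": {"read", "write", "install", "configure"},
    "staff": {"read", "write"},
    "guest": {"read"},
}

def allowed(role, action):
    # Unknown roles receive an empty permission set
    return action in PERMISSIONS.get(role, set())

print(allowed("staff", "install"))  # False: only admins may install
print(allowed("staff", "write"))    # True
```

A stolen staff password in this model yields read and write access, but never the ability to install software – exactly the containment the advice above is after.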

4 Build permissions from the bottom up

Application whitelisting might sound complicated, but what it really means is “deny by default”: it defines, in advance, what is allowed to run and ensures that nothing else will.

Most people think of computer security as restricting access, but whitelisting frames things in opposite terms and is therefore much more secure. Most operating systems contain whitelisting tools that are relatively easy to use. When used in conjunction with good advice, the result is a powerful tool to protect a network.
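
A minimal sketch of deny-by-default is below. The paths are invented examples, and real whitelisting tools built into operating systems work with signed executables and policies rather than a simple list:

```python
# Minimal "deny by default" sketch: only executables on the whitelist
# may run; anything absent from the list is refused, including new or
# unknown programs. Paths are invented examples.

WHITELIST = {
    "/usr/bin/python3",
    "/usr/local/bin/payroll",
}

def may_run(path):
    # Not being on the list is sufficient grounds to refuse
    return path in WHITELIST

print(may_run("/usr/bin/python3"))  # True
print(may_run("/tmp/malware.bin"))  # False: denied by default
```

Note that nothing needs to be known about the malware for it to be blocked – it simply isn’t on the list, which is what makes whitelisting stronger than blacklisting known threats.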

The Australian Signals Directorate released a video in 2012 with an overview of cyber threats.

Protect yourself from cyberattack: Simple things first

Following these basic rules covers the same ground as ASD’s top four mitigation strategies and substantially lowers your vulnerability to cyberattack. If you want to delve deeper, there are more tips on the ASD site.

There are many debates that will follow on from this, such as: developing a national cybersecurity strategy; deciding if people should have to report an incident; the sort of insurance that should be available; what constitutes a proportionate response to an attack; and a whole range of others.

Each of those debates is underpinned by a basic set of information that needs to be implemented first. Future shock is something that can be overcome in this space, and there are relatively simple measures that can be put into place in order to make us more secure. Before embarking on anything complicated, you should at least get these things right to protect yourself from cyberattack.

This article was first published by The Conversation on 16 October 2015. Read the original article here.

Supercontinent Revolution

Professor of geology at Curtin University Dr Zheng-Xiang Li considers himself a very lucky man. Born in a village in Shandong Province, East China, he fondly remembers the rock formations in the surrounding hills. But he was at school during the end of the Cultural Revolution – a time when academic pursuit was frowned upon and it was very hard to find good books to read. “Fortunately, I had some very good teachers who encouraged my curiosity,” recalls Li.

He went on to secure a place at the prestigious Peking University to study geology and geophysics. And in 1984, when China’s then leader Deng Xiaoping sent a select number of students overseas, Li took the opportunity to study for a PhD in Australia. With an interest in plate tectonics and expertise in palaeomagnetism, he’s since become an authority on supercontinents.

It is widely accepted that the tectonic plates – which carry the continents – are moving, and that a supercontinent, Pangaea, existed 320–170 million years ago. Li’s research is aimed at understanding how ‘Earth’s engine’ drives the movement of the plates.

His work has been highly influential, showing that another supercontinent, Rodinia, formed about 600 million years before Pangaea. And evidence is mounting that there was yet another ancient supercontinent before that, known as Nuna, which assembled about 1600 million years ago.

Li suspects there is a cycle wherein supercontinents break up and their components then disperse around the globe, before once again coming together as a new supercontinent.

“The supercontinent cycle is probably around 600 million years. We are in the middle of a cycle: halfway between Pangaea and a fresh supercontinent,” he says.

“We are at the start of another geological revolution. Plate tectonics revolutionised geology in the 1960s. I think we are now in the process of another revolution,” Li adds, undoubtedly excited by his work.

“The meaning of life can be described by three words beginning with ‘F’ – family, friends and fun,” he says. “And for me, work falls in the fun part.”

Clare Pain

Fuelling the future

The complex engineering that drives renewable energy innovation, global satellite navigation, and the emerging science of industrial ecology is among Curtin University’s acknowledged strengths. Advanced engineering is crucial to meeting the challenges of climate change and sustainability. Curtin is addressing these issues in several key research centres.

Bioenergy, fuel cells and large energy storage systems are a focus for the university’s Fuels and Energy Technology Institute (FETI), launched in February 2012. The institute brings together a network of more than 50 researchers across Australia, China, Japan, Korea, Denmark and the USA, and has an array of advanced engineering facilities and analytic instruments. It also hosts the Australia-China Joint Research Centre for Energy, established in 2013 to address energy security and emissions reduction targets for both countries. 

Curtin’s Sustainable Engineering Group (SEG) has been a global pioneer in industrial ecology, an emerging science which tracks the flow of resources and energy in industrial areas, measures their impact on the environment and works out ways to create a “circular economy” to reduce carbon emissions and toxic waste.

And in renewable energy research, Curtin is developing new materials for high temperature fuel cell membranes, and is working with an award-winning bioenergy technology that will use agricultural crop waste to produce biofuels and generate electricity.


Solar’s big shot

Curtin’s hydrogen storage scientists are involved in one of the world’s biggest research programs to drive down the cost of solar power and make it competitive with other forms of electricity generation such as coal and gas. They are contributing to the United States SunShot Initiative – a US$2 billion R&D effort jointly funded by the US Department of Energy and private industry partners to fast track technologies that will cut the cost of solar power, including manufacturing for solar infrastructure and components.

SunShot was launched in 2011 as a key component of President Obama’s Climate Action Plan, which aims to double the amount of renewable energy available through the grid and reduce the cost of large-scale solar electricity by 75%.

Professor Craig Buckley, Dean of Research and Professor of Physics at Curtin’s Faculty of Science and Engineering, is the lead investigator on an Australian Research Council Linkage Project on energy storage for Concentrating Solar Power (CSP), and a chief investigator with the SunShot CSP program. His team at Curtin’s Hydrogen Storage Research Group is using metal hydrides to develop a low cost hydrogen storage technology for CSP thermal energy plants such as solar power towers.

CSP systems store energy in a material called molten salts – a mixture of sodium nitrate and potassium nitrate, which are common ingredients in plant fertilisers. These salts are heated to 565°C, pumped into an insulated storage tank and used to produce steam to power a turbine to generate electricity. But it’s an expensive process. The 195 m tall Crescent Dunes solar power tower in Nevada – one of the world’s largest and most advanced solar thermal plants – uses 32,000 tonnes of molten salt to extend operating hours by storing thermal energy for 10 hours after sunset.

Metal hydrides – compounds formed by bonding hydrogen with a material such as calcium, magnesium or sodium – could replace molten salts and greatly reduce the costs of building and operating solar thermal power plants. Certain hydrides operate at higher temperatures and require smaller storage tanks than molten salts. They can also be reused for up to 25 years.

At the Nevada plant, molten salt storage costs an estimated $150 million – around 10–15% of operation costs, says Buckley. “With metal hydrides replacing molten salts, we think we can reduce that to around $50–$60 million, resulting in significantly lower operation costs for solar thermal plants,” he says. “We already have a patent on one process, so we’re in the final stages of testing the properties of the process for future scale-up. We are confident that metal hydrides will replace molten salts as the next generation thermal storage system for CSP.”


From biomass to fuel

John Curtin Distinguished Professor Chun-Zhu Li is lead researcher on a FETI project that was awarded a grant of $5.2 million by the Australian Renewable Energy Agency in 2015 to build a pilot plant to test and commercialise a new biofuel technology. The plant will produce energy from agricultural waste such as wheat straw and mallee eucalypts from wheatbelt farm forestry plantations in Western Australia.

“These bioenergy technologies will have great social, economic and environmental benefits,” says Li. “It will contribute to the electricity supply mix and also realise the commercial value of mallee plantations for wheatbelt farmers. It will make those plantations an economically viable way of combating the huge environmental problem of dryland salinity in WA.”

Li estimates that WA’s farms produce several million tonnes of wheat straw per year, which is discarded as agricultural waste. Biomass gasification is a thermochemical process converting biomass feedstock into synthesis gas (syngas) to generate electricity using gas engines or other devices.

One of the innovations of the biomass gasification technology developed at FETI is the destruction of tar by char or char-supported catalysts produced from the biomass itself. Other biomass gasification systems need water-scrubbing to remove tar, which also generates a liquid waste stream requiring expensive treatment, but the technology developed by Li’s team removes the tar without the generation of any wastes requiring disposal. This reduces construction and operation costs and makes it an ideal system for small-scale power generation plants in rural and remote areas.

Li’s team is also developing a novel technology to convert the same type of biomass into liquid fuels and biochar. The combined benefits of these bioenergy/biofuel technologies could double the current GDP of WA’s agricultural regions, Li adds.


Keeping renewables on grid

Professor Syed Islam is a John Curtin Distinguished Professor with Curtin’s School of Electrical Engineering and Computing. It’s the highest honour awarded by the university to its academic staff and recognises outstanding contributions to research and the wider community. Islam has published widely on grid integration of renewable energy sources and grid connection challenges. In 2011, he was awarded the John Madsen Medal by Engineers Australia for his research improving the prospects of wind energy generation by developing grid-code-enabled power conditioning techniques.

Islam explains that all power generators connected to an electricity network must comply with strict grid codes for the network to operate safely and efficiently. “The Australian Grid Code specifically states that wind turbines must be capable of uninterrupted operation, and if electrical faults are not immediately overridden, the turbines will be disconnected from the grid,” he says.

“Wind energy is a very cost effective renewable technology. But disturbances and interruptions to power generation mean that often wind farms fall below grid code requirements, even when the best wind energy conversion technology is being used.”

Islam has led research to develop a system that allows a faster response by wind farm voltage control technologies to electrical faults and voltage surges. It has helped wind turbine manufacturers meet grid regulations, and will also help Australia meet its target to source 20% of electricity from renewable energy by 2020.

Islam says micro-grid technology will also provide next-generation manufacturing opportunities for businesses in Australia. “There will be new jobs in battery technology, in building and operating micro-grids and in engineering generally,” he says.

“By replacing the need for platinum catalysts, we can make fuel cells much cheaper and more efficient, and reduce dependence on environmentally damaging fossil fuels.”


Cutting fuel cell costs

Professor San Ping Jiang from FETI and his co-researcher Professor Roland De Marco at University of the Sunshine Coast in Queensland recently received an Australian Research Council grant of $375,000 to develop a new proton exchange membrane that can operate in high-temperature fuel cells. It’s a materials engineering breakthrough that will cut the production costs of fuel cells, and allow more sustainable and less polluting fuels such as ethanol to be used in fuel cells.

Jiang, who is based at Curtin’s School of Chemical and Petroleum Engineering, has developed a silica membrane that can potentially operate at temperatures of up to 500°C. Fuel cells directly convert the chemical energy of fuels such as hydrogen, methanol and ethanol into electricity and provide a lightweight alternative to batteries, but they are currently limited in their application because conventional polymer-based proton exchange membranes perform most efficiently at temperatures below 80°C. Jiang’s membrane operates at up to 500°C using heteropoly acid functionalised mesoporous silica – a composite that combines high proton conductivity and high structural stability. His innovation also minimises the use of precious metal catalysts such as platinum in fuel cells, reducing the cost.

“The cost of platinum is a major barrier to the wider application of fuel cell technologies,” Jiang says. “We think we can reduce the cost significantly, possibly by up to 90%, by replacing the need for platinum catalysts. It will make fuel cells much cheaper and more efficient, and reduce dependence on environmentally damaging fossil fuels.”

He says the high temperature proton exchange membrane fuel cells can be used in devices such as smartphones and computers, and in cars, mining equipment and communications in remote areas.


Doing more with less

The SEG at Curtin University has been involved in energy efficiency and industrial analysis for just over 15 years. It’s been a global leader in an emerging area of sustainability assessment known as industrial ecology, which looks at industrial areas as ‘ecosystems’ that can develop productive exchanges of resources.

Associate Professor Michele Rosano is SEG’s Director and a resource economist who has written extensively on sustainability metrics, charting the life cycles of industrial components, carbon emission reduction and industrial waste management. They’re part of a process known as industrial symbiosis – the development of a system for neighbouring industries to share resources, energies and by-products. “It’s all about designing better industrial systems, and doing more with less,” Rosano says.

Curtin and SEG have been involved in research supported by the Australian Government’s Cooperative Research Centres Program to develop sustainable technologies and systems for the mineral processing industry at the Kwinana Industrial Area, an 8 km coastal industrial strip about 40 km south of Perth. The biggest concentration of heavy industries in Western Australia, Kwinana includes oil, alumina and nickel refineries, cement manufacturing, chemical and fertiliser plants, water treatment utilities and a power station that uses coal, oil and natural gas.

Rosano says two decades of research undertaken by Curtin at Kwinana is now recognised as one of the world’s largest and most successful industrial ecology projects. It has created 49 industrial symbiosis projects, ranging from shared use of energy and water to recovery and reuse of previously discarded by-products.

“These are huge and complex projects which have produced substantial environmental and economic benefits,” she says. “Kwinana is now seen as a global benchmark for the way in which industries can work together to reduce their footprint.”

An example of industrial synergies is waste hydrochloric acid from minerals processing being reprocessed by a neighbouring chemical plant for reuse in rutile quartz processing. The industrial ecology researchers looked at ways to reuse a stockpile of more than 1.3 million tonnes of gypsum, which is a waste product from the manufacture of phosphate fertiliser and livestock feeds. The gypsum waste is used by Alcoa’s alumina refinery at Kwinana to improve soil stability and plant growth in its residue areas.

The BP oil refinery at Kwinana also provides hydrogen to fuel Perth’s hydrogen fuel-cell buses. The hydrogen is produced by BP as a by-product from its oil refinery and is piped to an industrial gas facility that separates, cleans and pressurises it. The hydrogen is then trucked to the bus depot’s refuelling station in Perth.

Rosano says 21st century industries “are serious about sustainability” because of looming future shortages of many raw materials, and also because research has demonstrated there are social, economic and environmental benefits to reducing greenhouse emissions.

“There is a critical need for industrial ecology, and that’s why we choose to focus on it,” she says. “It’s critical research that will be needed to save and protect many areas of the global economy in future decades.”



Planning for the future

Research by Professor Peter Teunissen and Dr Dennis Odijk at Curtin’s Department of Spatial Sciences was the first study in Australia to integrate next generation satellite navigation systems with the commonly used and well-established Global Positioning System (GPS) launched by the United States in the 1990s.

Odijk says a number of new systems are being developed in China, Russia, Europe, Japan, and India, and it’s essential they can interact successfully. These new Global Navigation Satellite Systems (GNSS) will improve the accuracy and availability of location data, which will in turn improve land surveying for locating mining operations and renewable energy plants.

“The new systems have an extended operational range, higher power and better modulation. They are more robust and better able to deal with challenging situations like providing real-time data to respond to bushfires and other emergencies,” says Odijk.

“When these GNSS systems begin operating over the next couple of years, they will use a more diverse system of satellites than the traditional GPS system. The challenge will be to ensure all these systems can link together.”

Integrating these systems will increase the availability of data, “particularly when the signals from one system might be blocked in places like open-pit mines or urban canyons – narrow city streets with high buildings on both sides.”

Teunissen and Odijk’s research on integrating the GNSS involves dealing with the complex challenges of comparing estimated positions from various satellites, as well as inter-system biases, and developing algorithms. The project is funded by the Cooperative Research Centre for Spatial Information, and includes China’s BeiDou Navigation Satellite System, which is now operating across the Asia-Pacific region.

Rosslyn Beeby

Celebrating Australian success

Success lay with the University of Melbourne, which won Best Commercial Deal for the largest biotech start-up in 2014; the Melbourne office of the Defence Science and Technology Group, which won Best Creative Engagement Strategy for its ‘reducing red tape’ framework; and Swinburne University for the People’s Choice Award.

“These awards recognise research organisations’ success in creatively transferring knowledge and research outcomes into the broader community,” said KCA Executive Officer, Melissa Geue.

“They also help raise the profile of research organisations’ contribution to the development of new products and services which benefit wider society and sometimes even enable companies to grow new industries in Australia.”

Details of the winners are as follows:

The Best Commercial Deal award recognises any form of commercialisation that provides value-add to the research institution and has significant long-term social and economic impact:

University of Melbourne – Largest biotech start-up for 2014

The award recognised Australia’s largest biotechnology deal of 2014: Shire Plc’s purchase of Fibrotech Therapeutics P/L – a University of Melbourne start-up – for US$75 million upfront and up to US$472 million in subsequent payments. Fibrotech develops novel drugs to treat the scarring prevalent in chronic conditions such as diabetic kidney disease and chronic kidney disease, based on research by Professor Darren Kelly (Department of Medicine, St Vincent’s Hospital).

Shire is progressing Fibrotech’s lead technology through to clinical stages for focal segmental glomerulosclerosis, a condition known to affect children and teenagers with kidney disease. The original Fibrotech team continues to develop the unlicensed IP for eye indications in a new start-up, OccuRx P/L.

Best Creative Engagement Strategy showcases some of the creative strategies research organisations are using to engage with industry partner/s to share and create new knowledge:

Defence Science and Technology Group – Defence Science Partnerships (DSP): reducing red tape with a standardised framework

The DSP has reduced transaction times from months to weeks with over 300 agreements signed totalling over $16m in 2014-15. The DSP is a partnering framework between the Defence Science Technology Group of the Department of Defence and more than 65% of Australian universities. The framework includes standard agreement templates for collaborative research, sharing of infrastructure, scholarships and staff exchanges, simplified Intellectual Property regimes and a common framework for costing research. The DSP was developed with the university sector in a novel collaborative consultative approach.

The People’s Choice Award is open to the wider public to vote on which commercial deal or creative engagement strategy deserves to win. This year’s winner, which also took out last year’s award, is:

Swinburne University of Technology – Optical data storage breakthrough leads the way to next generation DVD technology – see DVDs are the new cool tech

Using nanotechnology, Swinburne Laureate Fellowship project researchers Professor Min Gu, Dr Xiangping Li and Dr Yaoyu Cao achieved a breakthrough in data storage technology, increasing the capacity of a DVD from a measly 4.7 GB to 1,000 TB. The discovery established the cornerstone of a patent-pending technique offering solutions for the big data era. In 2014, start-up company Optical Archive Inc. licensed the technology, and in May 2015 Sony Corporation of America purchased the start-up – despite it having no public customers or finished product on the market – on the strength of its people, its state of development and its intellectual property.
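The scale of that jump is easier to grasp with a quick back-of-the-envelope calculation, using only the two figures quoted above (decimal units assumed, i.e. 1 TB = 1,000 GB):

```python
# Rough scale of the claimed capacity jump (figures from the article).
dvd_gb = 4.7             # standard DVD capacity, in gigabytes
new_tb = 1000            # claimed capacity of the new technique, in terabytes

new_gb = new_tb * 1000   # convert terabytes to gigabytes (decimal convention)
factor = new_gb / dvd_gb # how many times larger the new capacity is
print(f"{factor:,.0f}x increase")
```

In other words, the technique packs more than 200,000 standard DVDs' worth of data into the same disc footprint.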

This article was shared by Knowledge Commercialisation Australia on 11 September 2015. 

New biosecurity centre to stop fruit flies

Upgraded biosecurity measures to combat fruit fly will be introduced in Australia, bringing added confidence to international trade markets.

South Australia is the only mainland Australian state free from fruit flies – a status critical to the horticultural industry’s successful and expanding international export market.

A new national Sterile Insect Technology facility in Port Augusta, located in the north of South Australia, will produce billions of sterile male fruit flies – at the rate of 50 million a week – to help prevent the threat of fruit fly invading the state.

The new measures will help secure producers’ access to important citrus and almond export markets including the United States, New Zealand and Japan, worth more than $800 million this year.

The Sterile Insect Technique (SIT) introduces sterile flies into the environment that then mate with the wild population, ensuring offspring are not produced.

Macquarie University Associate Professor Phil Taylor says the fly, known as Qfly because it comes from Queensland, presents the most difficult and costly biosecurity challenge to market access for most Australian fruit producers.

“Fruit flies, especially the Queensland fruit fly, present a truly monumental challenge to horticultural production in Australia,” he says.

“For generations, Australia has relied on synthetic insecticides to protect crops, but these are now banned for many uses. Environmentally benign alternatives are needed urgently – this is our goal.

“The impetus behind this initiative is to secure and improve trade access both internationally and nationally for South Australia.

“It will increase the confidence of overseas buyers in the Australian product and make Australia a more reliable supplier. Uncertainty or variation in the quality of produce would obviously be a concern for our trading partners.”

South Australia’s Agriculture Minister Leon Bignell says the $3.8 million centre will produce up to 50 million sterile male Qflies each week.

“The State Government has invested $3 million and Horticulture Innovation Australia Ltd (HIA) has contributed $800,000 to this project, and construction is expected to take 10 months,” Bignell says.

“While fruit fly is a major problem with horticultural crops in Australia’s other mainland states, South Australia remains fruit fly free, but we are still at risk of outbreak.”

“Producing male-only sterile Qflies has never been done before on this scale and this facility will have an enormous impact on the way in which we deal with outbreaks.”

Fruit fly management protects the commercial production of fruit and vegetables, including wine grapes and almonds, with an estimated farm-gate value of $851 million.

South Australia is also the only mainland state which has a moratorium on growing GM food crops and is one of the few places in the world free of the vine-destroying pest phylloxera.

“Because of these attributes, South Australian products stand out in the competitive global market, which is increasingly seeking clean and safe food and wine,” Bignell says.

The research partner consortium, SITplus, intends to invest about $50 million during the next five years to support the national fruit fly management program.

The consortium is a research group with experts from Macquarie University, Primary Industries and Regions SA’s Biosecurity SA and South Australian Research and Development Institute divisions, HIA, the CSIRO Health and Biosecurity Flagship, Plant & Food Research Australia, and the NSW Department of Primary Industries.

– John Merriman

This article was first published by The Lead South Australia on 2 September 2015. Read the original article here.

Building power by concentrating light

South Australian company HeliostatSA has partnered with Indian company Global Wind Power Limited to develop a portfolio of projects in India and Australia over the next four years, beginning with an initial 150 megawatts of concentrated solar power (CSP) electricity generated by a solar array in Rajasthan, India.

The projects are valued at $2.5 billion and will further cement HeliostatSA as a leader in the global renewable energy sector.

Heliostat CEO Jason May says India has committed to an investment target of US$100 billion in renewable energy by 2019 and has already secured $20 billion.

“India is looking for credible, renewable energy partners for utility scale projects,’’ says May.

“We bring everything to the table that they require such as size, project development experience, capital funding, field design capability, the latest technology, precision manufacturing and expertise.’’

Each solar array is made of thousands of heliostats – mirrors that track the sun and reflect its thermal energy onto a central receiver, where it is converted into electricity. Each HeliostatSA mirror is 3.21 x 2.22 metres, with an optical accuracy believed to be the best in the world. This reduces the number of mirrors required, cutting the overall cost of CSP while still delivering the same 24-hour electricity output.

The heliostats and their high-tech components are fabricated using laser mapping and steel-cutting technology.

The mirrors are slightly parabolic and components need to be cut and measured to exact requirements to achieve the strict operational performance.

“There is strong global interest in CSP with thermal storage for 24-hour power. At the moment large-scale batteries which also store electricity are very expensive. Constant advances in CSP storage technology over the next 10 years will only add to the competitive advantage,’’ says May.

– John Merriman

This article was first published by The Lead South Australia on 25 August 2015. Read the original article here.

From science fiction to reality: the dawn of the biofabricator

 

“We can rebuild him. We have the technology.”
– The Six Million Dollar Man, 1973

Science is catching up to science fiction. Last year a paralysed man walked again after cell treatment bridged a gap in his spinal cord. Dozens of people have had bionic eyes implanted, and it may also be possible to augment those eyes to see into the infra-red or ultra-violet. Amputees can control bionic limb implants with thought alone.

Meanwhile, we are well on the road to printing body parts.

We are witnessing a reshaping of the clinical landscape wrought by the tools of technology. The transition is giving rise to a new breed of engineer, one trained to bridge the gap between engineering on one side and biology on the other.

Enter the “biofabricator”. This is a role that melds technical skills in materials, mechatronics and biology with the clinical sciences.


21st century career

If you need a new body part, it’s the role of the biofabricator to build it for you. The concepts are new, the technology is groundbreaking. And the job description? It’s still being written.

It is a vocation that’s already taking off in the US though. In 2012, Forbes rated biomedical engineering (equivalent to biofabricator) number one on its list of the 15 most valuable college majors. The following year, CNN and payscale.com called it the “best job in America”.

These conclusions were based on factors such as salary, job satisfaction and job prospects, with the US Bureau of Labor Statistics projecting massive growth in the number of biomedical engineering jobs over the next ten years.

Meanwhile, Australia is blazing its own trail. As the birthplace of the multi-channel Cochlear implant, Australia already boasts a worldwide reputation in biomedical implants. Recent clinical breakthroughs with an implanted titanium heel and jawbone reinforce Australia’s status as a leader in the field.

The Cochlear implant has brought hearing to many people. Dick Sijtsma/Flickr, CC BY-NC

I’ve recently helped establish the world’s first international Masters courses for biofabrication, ready to arm the next generation of biofabricators with the diverse array of skills needed to 3D print parts for bodies.

These skills go beyond the technical; the job also requires the ability to communicate with regulators and work alongside clinicians. The emerging industry is challenging existing business models.


Life as a biofabricator

Day to day, the biofabricator is a vital cog in the research machine. They work with clinicians to create a solution to clinical needs, and with biologists, materials and mechatronic engineers to deliver them.

Biofabricators are naturally versatile. They are able to discuss clinical needs with a surgeon pre-dawn, device physics with an electrical engineer in the morning, stem cell differentiation with a biologist in the afternoon, and funding with a potential financier in the evening – all while remaining conscious of regulatory matters and social engagement.

Our research at the ARC Centre of Excellence for Electromaterials Science (ACES) is only made possible through the work of a talented team of biofabricators. Their work ranges from the conduits we are building to regrow severed nerves, to the electrical implant designed to sense an imminent epileptic seizure and stop it before it occurs, to the 3D printed cartilage and bone implants fashioned to be a perfect fit at the site of injury.

As the interdisciplinary network takes shape, we see more applications every week. Researchers have only scratched the surface of what is possible for wearable or implanted sensors to keep tabs on an outpatient’s vitals and beam them back to the doctor.

Meanwhile, stem cell technology is developing rapidly. Developing the cells into tissues and organs will require prearrangement of cells in appropriate 3D environments and custom designed bioreactors mimicking the dynamic environment inside the body.

Imagine the ability to arrange stem cells in 3D surrounded by other supporting cells and with growth factors distributed with exquisite precision throughout the structure, and to systematically probe the effect of those arrangements on biological processes. Well, it can already be done.

Those versed in 3D bioprinting will enable these fundamental explorations.


Future visions

Besides academic research, biofabricators will also be invaluable to medical device companies in designing new products and treatments. Those engineers with an entrepreneurial spark will look to start spin-out companies of their own. The more traditional manufacturing business model will not cut it.

As 3D printing evolves, it is becoming obvious that we will require dedicated printing systems for particular clinical applications. The printer in the surgery for cartilage regeneration will be specifically engineered for the task at hand, with only critical variables built into a robust and reliable machine.

The 1970s TV show, Six Million Dollar Man, excited imaginations, but science is rapidly catching up to science fiction. Joe Haupt/Flickr, CC BY-SA

Appropriately trained individuals will also find roles in the public service, ideally in regulatory bodies or community engagement.

For this job of tomorrow, we must train today, and new opportunities – such as the biofabrication masters courses mentioned above – are emerging. We must cut across the traditional academic boundaries that slow down such advances. We must engage with the community of traditional manufacturers whose skills can be built upon for next-generation industries.

Australia is also well placed to capitalise on these emerging industries. We have a traditional manufacturing sector that is currently in flux, an extensive advanced materials knowledge base built over decades, a dynamic additive fabrication skills base and a growing alternative business model environment.

– Gordon Wallace & Cathal D. O’Connell

This article was first published by The Conversation on 31 August 2015. Read the original article here.

Australia could lead in cybersecurity research

This article is part of The Conversation’s series on the Science and Research Priorities recently announced by the Federal Government. You can read the introduction to the series by Australia’s Chief Scientist, Ian Chubb, here.


Alex Zelinsky

Chief Defence Scientist, Defence Science and Technology

The national science and research priorities have been developed with the goal of maximising the national benefit from research expenditure, while strengthening our capacity to excel in science and technology.

Cybersecurity has been identified as a research priority due to Australia’s increasing dependence on cyberspace for national well-being and security. Cyberspace underpins both commercial and government business; it is globally accessible, has no national boundaries and is vulnerable to malicious exploitation by individuals, organised groups and state actors.

Cybersecurity requires application of research to anticipate vulnerabilities, strengthen cyber systems to ward off attacks, and enhance national capability to respond to, recover from, and continue to operate in the face of a cyber-attack.

Cyberspace is a complex, rapidly changing environment that is progressed and shaped by technology and by how the global community adopts, adapts and uses this technology. Success in cyberspace will depend upon our ability to “stay ahead of the curve”.

Research will support the development of new capability to strengthen the information and communications systems in our utilities, business and government agencies against attack or damage. Investment will deliver cybersecurity enhancements, infrastructure for prototype assessment and a technologically skilled workforce.

Accordingly, priority should be given to research that will lead to:

  1. Highly secure and resilient communications and data acquisition, storage, retention and analysis for government, defence, business, transport systems, emergency and health services
  2. Secure, trustworthy and fault-tolerant technologies for software applications, mobile devices, cloud computing and critical infrastructure
  3. New technologies for detection and monitoring of vulnerabilities and intrusions in cyber infrastructure, and for managing recovery from failure.

Alex Zelinsky is Chief Defence Scientist at the Defence Science and Technology Organisation.
Cybersecurity is becoming an increasingly important area for research in Australia.

Andrew Goldsmith
Director of the Centre for Crime Policy and Research, Flinders University

Sensible science and research on cybersecurity must be premised upon informed analysis rather than speculative “what if” thinking. Researchers should not be beholden to institutional self-interest from any sector: government, business, universities, or security and defence agencies.

We need to be clear about what the cybersecurity threat landscape looks like. It is a variable terrain. Terms such as “cyber-terrorism” tend to get used loosely and given meanings as diverse as the Stuxnet attack and the use of the internet by disenchanted converts to learn how to build a pipe bomb.

We need to ask and answer the question: who has the interest and the capability to attack us and why?

References to “warfare” can be misleading. A lot of what we face is not “war” but espionage, crime and political protest. More than two decades into the lifecycle of the internet, we have not yet had an electronic Pearl Harbour event.

Cybersecurity depends upon human and social factors, not just technical defences. We need to know our “enemies” as well as ourselves better, in addition to addressing technical vulnerabilities.

We should be sceptical about magic bullet solutions of any kind. Good defences and secure environments depend upon cooperation across units, a degree of decentralisation, and built-in redundancy.

Andrew Goldsmith is Strategic Professor of Criminology at Flinders University.


Jodi Steel
Director, Security Business Team at NICTA

Cybersecurity is an essential underpinning to success in our modern economies.

It’s a complex area and there are no magic bullet solutions: success requires a range of approaches. The national research priorities for cybersecurity highlight key areas of need and opportunity.

The technologies we depend on in cyberspace are often not worthy of our trust. Securing them appropriately is complex and often creates friction for users and processes. Creation of secure, trustworthy and fault-tolerant technologies – security by design – can remove or reduce security friction, improving overall security posture.

Australia has some key capabilities in this area, including cross-disciplinary efforts.

The ability to detect and monitor vulnerabilities and intrusions and to recover from failure is critical, yet industry reports indicate that the average time to detect malicious or criminal attack is around six months. New approaches are needed, including improved technological approaches as well as collaboration and information sharing.

Success in translating research outcomes to application – for local needs and for export – will be greater if we are also able to create an ecosystem of collaboration and information sharing, especially in the fast-moving cybersecurity landscape.

Jodi Steel is Director, Security Business Team at NICTA.


Vijay Varadharajan
Director, Advanced Cyber Security Research Centre at Macquarie University

Cyberspace is transforming the way we live and do business. Securing cyberspace from attacks has become a critical need in the 21st century to enable people, enterprises and governments to interact and conduct their business. Cybersecurity is a key enabling technology affecting every part of the information-based society and economy.

The key technological challenges in cybersecurity arise from increased security attacks and threat velocity; securing large-scale distributed systems, especially “systems of systems”; large-scale secure and trusted data-driven decision making; secure ubiquitous computing; and pervasive networking and global participation.

In particular, numerous challenges and opportunities exist in the emerging areas of cloud computing, the Internet of Things and Big Data. New services and technologies are emerging, and will continue to emerge, at the intersection of these areas. Security, privacy and trust are critical for these new technologies and services.

For Australia to be a leader, it needs to invest in research and development in these strategic areas of cybersecurity, leading to new secure, trusted and dependable technologies and services, as well as building capacity, skills and thought leadership for the cybersecurity of the future.

Vijay Varadharajan is Director: Advanced Cyber Security Research Centre at Macquarie University.

Cybercrime is a growing problem, and it'll take concerted efforts to prevent it escalating further. Brian Klug/Flickr, CC-BY NC

Craig Valli
Director of Security Research Institute at Edith Cowan University

ICT is in every supply chain or critical infrastructure we now run for our existence on the planet. The removal or sustained disruption of ICT as a result of lax cybersecurity is something we can no longer overlook or ignore.

The edge between cyberspace and our physical world is blurring, with destructive attacks on physical infrastructure already occurring. The notion of the nation state, with its powers and its ability to cope with these disruptions, is also being significantly challenged.

The ransacking of countries’ intellectual property by cyber-enabled actors is continuing unabated, robbing us of our collective futures. These are some of the strong indicators that currently we are getting it largely wrong in addressing cybersecurity issues. We cannot persist in developing linear solutions to network/neural security issues presented to us by cyberspace. We need change.

The asymmetry of cyberspace allows a relatively small nation state to gain significant advantage in cybersecurity, Israel being one strong example. Australia could be the next such nation, but not without significant, serious, long-term, collaborative investment by government, industry, academia and the community in growing the necessary human capital. This initiative is hopefully the start of that journey.

Craig Valli is Director of Security Research Institute at Edith Cowan University.


Liz Sonenberg
Professor of Computing and Information Systems, and Pro Vice-Chancellor (Research Collaboration and Infrastructure) at University of Melbourne

There are more than two million actively trading businesses in Australia, and more than 95% have fewer than 20 employees. Such businesses surely have no need for full-time cybersecurity workers, but all must have someone responsible for deciding which IT and security products and services to acquire.

At least historically, new technologies have been developed and deployed without sufficient attention to the security implications. So bad actors have found ways to exploit the resulting vulnerabilities.

More research into software design and development from a security perspective, and research into better tools for security alerts and detection is essential. But such techniques will never be perfect. Research is also needed into ways of better supporting human cyberanalysts – those who work with massive data flows to identify anomalies and intrusions.

New techniques are needed to enable the separation of relevant from irrelevant data about seemingly unconnected events, and to integrate perspectives from multiple experts. Improving technological assistance for humans requires a deep understanding of human cognition in the complex, mutable and ephemeral environment of cyberspace.

The cybersecurity research agenda is thus only partly a technical matter: disciplines such as decision sciences, organisational behaviour and international law all must play a part.

Liz Sonenberg is Professor, Computing and Information Systems, and Pro Vice-Chancellor (Research Collaboration and Infrastructure) at University of Melbourne.


Sven Rogge
Professor of Physics and Program Manager at the Centre for Quantum Computation & Communication Technology at UNSW

Cybersecurity is essential for our future in a society that needs to safeguard information as much as possible for secure banking, safe transportation, and protected power grids.

Quantum information technology will transform data communication and processing by exploiting quantum physics in new technologies to protect, transmit and process information. Classical cryptography relies on mathematically hard problems, such as factoring, that are so difficult that classical computers can take decades to solve them. Quantum information technology offers an alternative approach that could deliver solutions on a meaningful timescale – minutes rather than years. It also allows for secure encoding and decoding governed by fundamental physics, which is inherently unbreakable, not just hard to break.
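As a toy illustration of the factoring problem behind classical cryptography (a sketch only, not a real cryptosystem): trial division factors a small semiprime instantly, but its cost grows so steeply with the size of the number that the approach is hopeless for the hundreds-of-digits moduli used in practice – which is why fast quantum factoring would matter.

```python
def trial_division(n):
    """Factor n by trial division -- fine for toy numbers,
    utterly impractical for cryptographic-size integers."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor fully
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

# A toy RSA-style modulus: the product of two small primes.
print(trial_division(101 * 113))  # [101, 113]
```

Trial division needs on the order of sqrt(n) steps, so every extra pair of digits in n multiplies the work tenfold; real cryptographic moduli put this far beyond any classical computer's reach.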

Internationally, quantum information is taking off rapidly underlined by large government initiatives. At the same time there are commercial investments from companies such as Google, IBM, Microsoft and Lockheed Martin.

Thanks to long-term strategic investments in leading academic groups, Australia remains at the forefront globally and enjoys a national competitive advantage in quantum computing and cybersecurity. We should use our position as a world leader in quantum information science to build new high-technology industries for Australia’s future.

Sven Rogge is Professor of Physics at UNSW Australia.

This article was originally published on The Conversation and shared by Edith Cowan University on 10 July 2015. Read the original article here.


Read more in The Conversation Science and Research Priorities series.

The future of manufacturing in Australia is smart, agile and green

On the road: research can improve transport across Australia

Research priority: make Australia’s health system efficient, equitable and integrated

Baby immunisation: One in 10 infants at risk

Almost one in 10 Australian infants are at risk of severe infections because they are not up-to-date with their immunisations.

According to new research from the University of Adelaide in South Australia, conducted in conjunction with University College London, children of socio-economically disadvantaged parents – not just of parents who disagree with immunisation – were more likely not to be fully immunised.

The study examined the barriers to childhood immunisation experienced by parents in Australia. Overall, researchers found 91% of infants were up-to-date with their immunisations.

Associate Professor Helen Marshall, from the University of Adelaide’s Robinson Research Institute and Director of the Vaccinology and Immunology Research Trials Unit at the Women’s and Children’s Hospital, says this is the first Australia-wide study to show that factors associated with social disadvantage affect immunisation uptake more than unwillingness to have children immunised.

“In this study we looked at the most current individual-level data available of more than 5000 Australian children, aged 3–19 months,” she says.

The study found 9.3% of children were partially immunised or not immunised at all, and of these only one in six had parents who disagreed with immunisation.

“So the majority of infants who were incompletely immunised had parents who do not object to immunisation – something else is getting in the way,” she says.

Marshall says the primary barriers to immunisation included minimal contact with, and access to, health services; being a single parent; and children living in a large household.

“Socio-economic disadvantage was an important reason why parents had children who were either partially immunised or not immunised at all,” she says.

“Children with chronic medical conditions were also more likely not to be up-to-date with immunisations. This is possibly due to parents and health care providers lacking knowledge about the additional vaccines recommended for children with certain medical conditions, or to concerns that vaccines may have adverse effects in these children,” she says.

Marshall says these findings can inform programs to increase the uptake of immunisations.

“Reminders and rescheduling of cancelled appointments, and offering immunisation in different settings may help achieve better protection for children and the community,” says Marshall.

“This research found that the majority of parents with partially immunised children are in favour of vaccinations, so we need to look at how we can remove the barriers experienced by these families.”

The research was published in the journal Vaccine.

This article was first published on 6 August 2015 by The Lead Australia. Read the original article here.

Test on chemo drugs predicts side effects

A chemosensitivity test aims to identify which chemo drugs will benefit sarcoma cancer patients and which may cause unwanted side effects.

University of Western Australia’s School of Surgery researchers are currently comparing three methods to identify the most effective and reliable method to grow a patient’s tumour cells.

Co-lead researcher Dr Nicholas Calvert says sarcoma is a group of rare cancers arising from bone, muscle and cartilage.

“While they are rare, they can be very aggressive and early detection is vital to successful treatment, which can involve chemotherapy, radiotherapy, and surgical treatment,” he says.

Calvert says it is difficult to predict tumour responsiveness to chemotherapy because there are over 70 different types of sarcoma with significant variation in the genetic profile of cells within each type.

Chemotherapy in this area is generally guided by research on chemotherapy efficacy on a specific tumour type or those that are similar.

“So successfully predicting whether a patient’s tumour will be similar to another patient’s tumour of the same type is very difficult,” says Calvert.

“Especially given there are only around 1200 new cases per year, which does not provide a large enough patient pool to trial different chemotherapy regimens.”


Gene library and cell cultures methods considered

One of the methods under review involves researchers analysing DNA from tumour cells and comparing them to an international library of genes to identify whether they have any mutations that will help or prevent a chemotherapy drug from working.

Another method involves growing tumour cells in the lab and then exposing them to different chemotherapy drugs to see which kill the cells and at what dose.

Finally, mouse xenografts will be considered, in which tumour cells are grown in lab mice that are then treated with different chemotherapy drugs to see which kill the cells and at what dose.

Calvert says once this pilot study is completed they will expand it to a national trial to identify which of these tests is effective and reliable to select chemotherapy drugs.

“If we can identify a test that will allow us to take a sample of tumour, and identify how it will respond to chemotherapy it will have significant benefit for not only those with sarcoma but also other cancers,” says Calvert.

He says this ‘personalised medicine’ approach aims to confirm a tumour will respond to an agent before it is even given, avoiding the significant and sometimes life-threatening side effects of some of the chemotherapy agents.

Approximately 1200 new cases of sarcoma are diagnosed each year in Australia, accounting for approximately 1% of all adult malignancies and 15% of paediatric malignancies.

– Teresa Belcher

This article was originally published on Science Network Western Australia. Read the original article here.

 

A new sunscreen made from fish slime and algae

Researchers have developed a new UV blocking material out of naturally occurring molecules found in algae and fish slime that can be used to make more effective sunscreen, bandages and contact lenses.

Organisms like algae and cyanobacteria have evolved to synthesise their own UV screening compounds, such as mycosporine-like amino acids (MAAs).

MAAs are commonly found in the creatures that eat algae and cyanobacteria as well – tropical fish like those found on the Great Barrier Reef accrue the material in their slime and eyes to protect themselves from harmful UV radiation.

“Mycosporines are present a little bit everywhere, in many types of organisms,” says Professor Vincent Bulone, co-author of the research paper and Director of the ARC Centre of Excellence in Plant Cell Walls at the University of Adelaide.

“We have attached these small UV absorbing molecules in a non-reversible manner to a polymer called chitosan, that you can extract from the shells of shrimp or crabs.”

The result is an all-natural UVA and UVB screening material. Thanks to the versatility of chitosan, it can be used in a cream for topical application, a transparent film for use in materials like bandages, or coated on objects like textiles and outdoor furniture to protect them from UV damage.

Current sunscreen formulas use a combination of materials in order to screen both UVA and UVB radiation, including some that can have a negative effect on health in the long term, such as titanium dioxide.

“It outperforms some of the compounds that are already used on the market in terms of the UV absorption capacity. The good thing is that it’s completely natural. We’ve also tested them on cell cultures and know they are not toxic,” says Bulone.

“We know, under laboratory conditions, the MAAs have no harmful effects. So they can be used for wound healing dressings for instance. You don’t need to change that dressing as often and it facilitates the healing of the skin.”

The compound is also highly stable, even under high temperatures.

While chitosan is already widely used for many applications and easily extracted from crustacean waste products such as prawn shells, MAAs are more difficult to produce.

“Extracting it from algae would be a very expensive process, but it is possible to produce them by engineering bacteria. This has been done since the early 90s. It’s not a cheap process, but it can be done.”

Bulone was recently installed as Director of the ARC Centre of Excellence in Plant Cell Walls at the University of Adelaide in South Australia.

“I’ve only started recently in South Australia. This work was done in my lab in Sweden. I still split my time, 70% in Adelaide and 30% in Sweden.”

Published in ACS Applied Materials & Interfaces, the research was undertaken with colleagues at Sweden’s Royal Institute of Technology. It also involved close collaboration with partners in Spain.

Bulone is actively developing new collaborations within Australia and internationally to develop new concepts leading to increased crop production and quality for nutrition as well as protection of crops against devastating fungal pathogens. These developments rely on his long-standing expertise in the biochemistry of carbohydrates from plant and fungal cell walls.

This article was first published by The Lead on 29 July 2015. Read the original article here.

Buy Vision, Give Sight

Eyewear brand Revo and U2 lead singer Bono are joining forces with the Brien Holden Vision Institute to eliminate avoidable blindness and vision impairment.

“Eye tests and eye examinations are at the front line of eye care. But for millions of people without access, the simplest problems go untreated. It’s unnecessary and avoidable,” says Kovin Naidoo, Global Director of Programs, Brien Holden Vision Institute.

When consumers purchase Revo sunglasses, $10 from the sale of every pair will be donated by Revo to the “Buy Vision, Give Sight” initiative. To execute the initiative, Revo and Bono are partnering with the Brien Holden Vision Institute to provide sustainable solutions for eye care and end avoidable blindness and vision impairment in under-resourced communities.

Bono, who has a long track record in global health, particularly as an activist in the fight against HIV/AIDS, was diagnosed with glaucoma 20 years ago. His experience with glaucoma, for which he has received excellent treatment, has made him determined to find a way to increase access to frontline eye health services for others.

“The ‘Buy Vision, Give Sight’ campaign is a very personal one for me,” says Bono.

“Thanks to good medical care my eyes are okay, but tens of millions of people around the world with sight problems don’t have access to glasses, or even a basic eye test. Poor eyesight may not be life-threatening, but it dramatically affects your life and your livelihood if you aren’t able to fix it.  When we met with experts, they said the number one problem is untreated poor vision, which prevents a child from learning in school, or an adult from performing their job. Sight is a human right and the ‘Buy Vision, Give Sight’ initiative will help ensure millions of people have access to the eye exams and glasses they need to see.”


“With Brien Holden, we found a partner doing remarkable work, hand-in-hand with local communities.  It’s mind-expanding what they are achieving; we’re very excited to work in partnership with them and Revo,” says Bono.

Yehuda Shmidman, Sequential Brands Group CEO, commented, “We are very excited about this partnership. Revo’s pioneering lens technology has always put eye-health central to Revo products and we believe Revo buyers will embrace the idea that their purchase is helping someone else. We’re very proud to support Bono and the Brien Holden Vision Institute in their efforts to bring basic eye care services to millions of people around the world.”

Professor Brien Holden, CEO, Brien Holden Vision Institute says, “It is extremely helpful that Revo and Bono recognise the impact that uncorrected vision impairment has on the lives of the 625 million people globally who do not have access to a simple eye examination or pair of glasses. Revo and Bono’s commitment to our programs will have a lasting impact on millions of lives globally.”

The funds donated by Revo will help pay for basic eye care services, particularly eye tests and prescription glasses, and also build stronger eye care services in target communities for the longer term by training local people to provide eye care and detect eye diseases.

During U2’s Innocence + Experience World Tour, Bono will exclusively wear Revo sunglasses. He has designed a capsule collection of sunglasses for the brand, available in the North American fall, which will include lenses outfitted with Revo’s LMS™ technology. As with all Revo sunglasses, $10 from each pair of the Bono for Revo collection will go to the Brien Holden Vision Institute.

This article was published by the Brien Holden Vision Institute on 25 July 2015. Read the original article here.

Frog researchers help kids make great leaps in literacy

Researchers from James Cook University in Far North Queensland worked with children’s author Emma Homes to create a kids’ book, The Vanishing Frogs of Cascade Creek – now shortlisted for a Wilderness Society fiction prize.

“I was interested in the idea of using fictional characters to raise awareness of science. I think people remember more when you tell them a story,” said Homes.

Wildlife diseases such as chytrid fungus, which is killing frogs worldwide, can devastate animal populations, but are often not well publicised or understood by the general public.

That’s where JCU experts Dr Lee Berger and Dr Lee Skerratt came in, to help answer questions about chytrid fungus, and explain how a sick frog might be examined in the laboratory.

“Lee Berger told me about a suitable frog species to cover in the book – the waterfall frog – and its habitat in the rainforest of Northern Queensland. We went for a trip to the Daintree Rainforest together, which was helpful for the writing process,” said Homes.

Berger thinks the books are a fantastic way to educate the general public. “It’s great that these books raise awareness of wildlife disease – a neglected conservation issue.  Similar to weeds and feral animals, introduced diseases can have catastrophic effects but often go under the radar.”

The Vanishing Frogs of Cascade Creek has recently been shortlisted for the Wilderness Society’s Environment Award for Children’s Literature in the fiction category.

Homes’ second book in the ‘Ruthie’ series, Saving Wombats, is informed by Skerratt’s PhD and tackles the disease sarcoptic mange, which can affect wombats and other mammals.

This article was published by James Cook University on 20 July 2015.


Driverless car trials in South Australia

A major European carmaker will conduct the first on-road trials of driverless cars in the Southern Hemisphere in South Australia in November.

The testing by Volvo will be held in conjunction with an international conference on driverless cars in Adelaide.

Volvo will test the same vehicle being used in their “Drive Me” project in Sweden.

South Australia legalised the use of driverless cars on its roads earlier this year.

The testing is part of independent road research agency ARRB’s Australian Driverless Vehicle Initiative.

ARRB Managing Director Gerard Waldon said that automated vehicles are a short-term reality that Australia needs to be prepared for.

“The South Australian Government has been quick to recognise this,” he said.

“ARRB will establish how driverless technology needs to be manufactured and introduced for uniquely Australian driving behaviour, our climate and road conditions, including what this means for Australia’s national road infrastructure, markings, surfaces and roadside signage,” said Waldon.

Volvo’s testing will be undertaken in conjunction with Flinders University, Carnegie Mellon University, the RAA and Cohda Wireless.

The Premier of South Australia, Jay Weatherill said the technology promises to not only improve safety, reduce congestion and lower emissions, but also to provide a real opportunity for South Australia to become a key player in the emerging driverless vehicle industry.

“This trial presents a fantastic opportunity for South Australia to take a lead nationally and internationally in the development of this new technology and open up new opportunities for our economy,” he said.

The driverless car trials will take place on an expressway south of the capital city of Adelaide on 7–8 November 2015.

Multiple vehicles will conduct manoeuvres such as overtaking, lane changing, emergency braking and the use of on and off ramps.

The International Driverless Cars Conference will be hosted at the Adelaide Convention Centre and Tonsley precinct on 5–6 November 2015.

This article was first published by The Lead on 21 July 2015. Read the original article here.

Closing the gap

Romlie Mokak, CEO of the Lowitja Institute for Aboriginal and Torres Strait Islander Health Research, is a man with a vision.

“We’ve got a clear agenda for the future and it’s for just 15 years ahead: 2030. This agenda has been agreed upon by governments and Aboriginal and Torres Strait Islander leadership as part of the ‘Close the Gap’ campaign,” said Mokak.

The aim is to eliminate the difference in life expectancy between Aboriginal and Torres Strait Islander people and other Australians by 2030. It’s a big ambition that will take a lot of work.

“It’s essential that solutions in Aboriginal and Torres Strait Islander health and wellbeing come from the people themselves,” he said. A vital step is explicit recognition of Indigenous people in the Australian Constitution, supported by the Recognise Health coalition launched by the Lowitja Institute in March 2015.

“If we hit the target, then by 2040 we will have had 10 years with no gap. We will have a high quality, accessible health system that is culturally appropriate for Aboriginal and Torres Strait Islander people.”

Since 1997, the Lowitja Institute and its predecessor CRCs have led a substantial reform agenda in Aboriginal and Torres Strait Islander health research by working with communities, researchers and policymakers. In partnership with 21 participants, the CRC is poised to make a substantial contribution to the goals for 2030 and towards a 2040 that sees Aboriginal and Torres Strait Islander participation and leadership in all walks of Australian life.

— Clare Pain

lowitja.org.au

Feature image: Smoking ceremony conducted by Wurundjeri Elder Aunty Joy Wandin Murphy at the Lowitja Institute CRC launch in October 2014.

 

Multi-million-dollar deal brings UQ pain drug closer to reality

A chronic pain treatment discovered at The University of Queensland is a step closer to clinical use, with a global pharmaceutical giant acquiring the Australian-founded company developing the drug.

Spinifex Pharmaceuticals has been acquired by Novartis International AG for an upfront cash payment of $US200 million (about $A260 million), plus undisclosed clinical development and regulatory milestone payments.

Spinifex is a biopharmaceutical company founded by UQ commercialisation arm UniQuest.

UQ Vice-Chancellor and President Professor Peter Høj welcomed the acquisition and congratulated those involved.

“This is one of the largest Australian biotech deals in history, and is a stunning outcome for the company, the researchers and the investors,” Professor Høj said.

“Spinifex builds on the unprecedented commercial translation achievements of UQ, which includes the world’s first cancer vaccine, Gardasil.

“It is a shining example of UQ’s determination to take research from excellence to what I call ‘excellence plus’, developing a product that has potential to improve the lives of people around the world.”

Spinifex is developing the drug candidate EMA401, an oral treatment for chronic pain, particularly neuropathic pain (a type of nerve pain), without central nervous system side effects.

The technology is based on a discovery by UQ’s Professor Maree Smith.

Professor Smith said the acquisition brought EMA401 a step closer to the people who needed it most.

“Chronic pain can be a debilitating condition, most commonly associated with cancer chemotherapy, post-herpetic neuralgia (a painful condition that can follow shingles), diabetes, peripheral nerve injury and osteoarthritis.

“It’s wonderful to see this deal eventuate, bringing a much-needed treatment option a little closer to reality for the millions of pain sufferers around the world,” Professor Smith said.

UQ pain researcher Professor Maree Smith

UniQuest CEO Dr Dean Moss said Professor Smith’s work was at the cutting edge of pain research.

“Her achievements and expertise have contributed to the formation of the recently-launched Queensland Emory Drug Discovery Initiative (QEDDI),” Dr Moss said.

QEDDI, a collaboration between UQ and Emory University in the US, will see the development of drugs to combat health issues including cancer, diabetes, inflammatory disorders and infectious diseases.

EMA401 is a novel angiotensin II type 2 (AT2) receptor antagonist being developed as a potential first-in-class oral treatment.

Professor Smith and UQ’s Dr Bruce Wyse’s research identified AT2 receptor antagonists as inhibitors of neuropathic and inflammatory pain in preclinical models.

Spinifex is supported by a syndicate of investors, including UniQuest, NovoVentures (Novo A/S), Canaan Partners, GBS Venture Partners, Brandon Capital Partners and UniSeed (a venture fund operating at the Universities of Melbourne, Queensland and New South Wales).