Researcher Application Software Service

"Work smarter, let the machines work harder"

2013 3D Printing Showcase

In an Australian first, the University of Melbourne, led by ITS Research, played host to the 2013 3D Printing Showcase, an event designed to bring together researchers, students and commercial entities active in 3D printing in a single place. It was held in the School of Engineering’s Student Lounge and attracted 30 exhibitors and nearly 600 attendees.

3D prints from Architecture, Building and Planning

A range of 3D printers were on display, including a new machine shipped in especially from Sydney that creates full-colour 3D models from sheets of standard recyclable paper. A large range of Fused Deposition Modelling printers were up and running, with hundreds of fantastic printed parts on display. Many lucky punters were even allowed to take some of these curiosities home. Most of the printers used plastics like ABS (the stuff they make Lego from) and PLA (a biodegradable plastic), but there were models made from paper, nylon, wood and even titanium! The titanium printers were apparently too big to bring in, but the parts they can make are stunning, both in strength and complexity.

Titanium prints from Monash

A highlight of the day came when showcase partner Quantum Victoria donated hand-built 3D printers to 10 disadvantaged schools, presented by Vice-Chancellor Professor Glyn Davis. The team from Quantum Victoria and some staff from the University of Melbourne worked alongside teachers from these schools to build the printers, giving invaluable insight into the workings of these devices, and the schools had a week to play with them before the showcase. On the day, the teachers and their principals attended and showed off what they had achieved in the short time they had. Several spoke to the media about the experience and were unanimous in their praise for the initiative.

Professor Glyn Davis (UoM), Soula Bennett (Quantum Victoria) and principals from the high schools

Research featured strongly in the exhibition, with displays from RMIT, Monash, Swinburne and La Trobe alongside several from the University of Melbourne, including a 3D-printed replica of Ned Kelly’s death mask. The potential of 3D printing in research, both as a focus and as an enabling tool, is only just becoming clear. There were 3D-printed robots, aeroplane parts, brains printed from MRI scans, a dog spine and even an articulated prosthetic hand. It is clear that we have only scratched the surface, and no doubt there will be a much bigger research presence next year.

People looking at 3D printed parts

Complementing the exhibition was a series of lightning talks, ranging from the social impacts of 3D printing to the industrial and biomedical applications of the technology. Professor of Architecture Glenn Katz from Stanford University presented one of the talks, live-streamed from California. The underlying message from these talks, and from the exhibitors themselves, is that this technology will fundamentally change the research landscape, a change that is already apparent in many areas.

Professor Marcus Wigan (Swinburne University)

Professor Andrea O’Conner (UoM)

Professor Glenn Katz (Stanford University)

Almost all research institutions and hardware and software vendors engaged in digital fabrication in Victoria were represented at the showcase, providing excellent opportunities for engagement and collaboration.

Professor Glyn Davis (UoM)

Media interest in the event was much higher than expected, with Professor Glyn Davis and the showcase organising staff interviewed several times by various print, radio and TV news media. These included The Age Online, SBS’s The Feed and several ABC programs, including 774 Evenings with Lindy Burns, The World Today with Rachel Brown, 702 Drive with Ellen Fanning and 774 Afternoons with Richard Stubbs. International 3D printing website 3Ders also featured the story.

Watch the video: 3D Printing Showcase, 2013

High Performance Computing takes yet another step!

Over 200 researchers and research students from the University of Melbourne use the Edward high-performance computing service every day. In this second blog post we continue to explore how they’re using Edward, and explain how researchers could benefit from this supported, free and easy-to-use service.

Today, most research involves the use of computers in some way. When a researcher’s computational needs exceed the capacity of their desktop system, many turn to High Performance Computing (HPC) systems. While huge HPC systems, sometimes called supercomputers, make headlines, even smaller systems can have a huge impact. Information Technology Services (ITS) operates an entry-level HPC system, called Edward, which is freely available to all researchers at the University. Here we want to share more amazing examples of ‘Researchers doing great things’.

Andrew from the Department of Psychiatry

Andrew has been studying brain connectivity in patients with mental illness. In particular, he has been tracing nerve fibre bundles in the human brain by applying tracking algorithms to an individual’s MRI data. The set of all fibre bundles can then be used to map the network of connections between different parts of the brain. The ultimate goal is to identify connections that are disrupted in clinical cohorts. Here we show some results from a recently published article which could be of interest to some of our students at UoM.

The figure on the left depicts nerve fibres in the human brain that show evidence of damage in long-term, heavy cannabis users. The fibre tracking algorithm used to trace out the nerve fibres in each study was run on Edward.

The figure on the right shows a network schematic of brain circuits that are disrupted in children with early onset schizophrenia.

Andrew says “HPC has allowed me to map the human connectome in a number of individuals in a much shorter computation time than what would be required with serial processing.”



Paul & Ken from the School of Physics

Scattering cross-section

Paul and Ken are part of an international collaboration that conducts theoretical studies of nuclear reactions responsible for the generation and abundance of the chemical elements seen in the universe, as well as reactions involving elements that are highly unstable, thus testing our knowledge of nuclear matter.

Paul: “HPC has given us access to the cluster Edward, allowing us to run many jobs simultaneously with different parameters describing the nuclear interaction. This means that characterising the forces involved in our calculations can occur much faster than if jobs were run end-to-end on standard computing equipment.”

Ken: “HPC has provided the computing resources that all of the colleagues in the collaboration can access to implement the basic codes to solve the mathematical equations (the Lippmann-Schwinger coupled-channel equations) central to spectral evaluations.”
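For readers who haven't met them, the coupled-channel Lippmann-Schwinger equations can be written schematically as below. This is generic scattering-theory notation, not necessarily the collaboration's exact conventions:

```latex
% Schematic coupled-channel Lippmann-Schwinger equation (illustrative notation):
% T is the transition matrix, V the interaction, c the channel index,
% and \epsilon_{c''}(q) the channel energy.
T_{cc'}(p, p'; E) = V_{cc'}(p, p')
  + \sum_{c''} \int_0^{\infty} q^2 \, dq \,
    \frac{V_{cc''}(p, q) \, T_{c''c'}(q, p'; E)}{E - \epsilon_{c''}(q) + i0}
```

Each job solves this set of coupled integral equations for one choice of the interaction parameters, which is why the problem parallelises so naturally across many independent jobs on Edward.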

“We can now evaluate low excitation spectra of nuclei, including resonance properties, and also the cross sections for scattering of two nuclei. The cross sections attained so far match well with recent experimental data from low-energy collisions, such as that of exotic oxygen-14 nuclear ions scattering from hydrogen targets.”

The figure on the right shows how the cross-section varies as a single parameter is modified. The peaks correspond to resonances in the nuclear spectrum.

Mani from Computing and Information Systems

In this study, Mani investigated two common parallel and distributed programming techniques to construct a parallel and distributed CoXCS model. CoXCS is a multi-population learning classifier that employs a co-evolutionary learning mechanism to evolve several sub-populations together. It combines cooperative co-evolutionary algorithms with ensemble learning techniques to build a multi-population ensemble classifier for solving large-scale and complex classification problems. The nature of the CoXCS architecture allows several different parallel and distributed programming techniques to be used. Basically, CoXCS is a federation of multiple XCS-Cores.

Mani: “Therefore we can use parallel processing architectures based on shared memory to speed up each XCS-Core individually, or use distributed-memory architectures to run multiple XCS-Cores in parallel. Alternatively, we can combine both models and propose a hybrid parallel and distributed CoXCS model. We also investigated using the CUDA library to build a GPU-accelerated CoXCS model, as well as a Cloud CoXCS. Our strong scaling simulation experiments demonstrate an efficiency of 60% to 80% on MPI and OpenMP architectures, and our CUDA-enabled CoXCS achieves a 12x speedup on average.”
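For readers new to HPC benchmarking, the "strong scaling efficiency" Mani mentions has a simple definition: the speedup over the serial run time, divided by the number of workers. A minimal sketch (the timings below are made up for illustration; they are not Mani's measurements):

```python
# Strong scaling: fix the problem size, add workers, and see how far the
# measured speedup falls short of the ideal (one-to-one) speedup.

def strong_scaling_efficiency(t_serial, t_parallel, n_workers):
    """Efficiency = (t_serial / t_parallel) / n_workers."""
    speedup = t_serial / t_parallel
    return speedup / n_workers

# e.g. a 1000 s serial run that finishes in 78.125 s on 16 workers:
eff = strong_scaling_efficiency(1000.0, 78.125, 16)
print(f"{eff:.0%}")  # 80%
```

An efficiency of 60-80%, as in the MPI/OpenMP experiments, means most of the added hardware is doing useful work rather than coordination overhead.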

ITS Research would like to thank Andrew, Paul, Ken and Mani.

The dream of an app store for research: The coming “app-stack store” and the future of how to provide tools to researchers

What should we all be talking about at the eResearch 2013 Australasia conference / #eRes2013 next week? Perhaps the conversation should be about what will fundamentally change the way researchers do their research over the next year.

This post is a sneak peek of the latest feature that is about to be floated out on the NeCTAR Research Cloud (powered by OpenStack) at the start of 2014. If you are around at #eRes2013 then you should come have a chat about it in the dev lounge. This new feature will be for developers, managers, and most importantly, researchers. It will change the way in which researchers use their day-to-day computational tools.

Why? Because it is going to enable research tools to be shared in a way that has never happened before, making it simple to reuse software, workflows and data. Over the next year, this feature will enable the conversation about who is going to share what tools with whom, and how. Most importantly, we’ll be able to start talking about best practice in sharing those tools, and how best to bring value to Australian researchers.

What is this feature, and why is it going to change everything?

Some will call this feature a ‘research app store’ or ‘research marketplace’ (some will even call it a research bazaar), but don’t be fooled: this is so much more powerful than the apps you have on your iPhone. This feature is not about the technology; it’s about:

  1. Lowering the barrier for a researcher to take their bespoke or generic computational tool out of their personal toolbox and magically give it to anyone else in the world;
  2. Lowering the barrier for a researcher to use someone else’s bespoke or generic discipline tool, without having to compile, configure, debug or deploy anything.

However, for our community who provide IT to researchers to succeed in this dream, we must start to develop our tools via a common framework (called Heat). Here is a video that explains the framework, why we need to work this way, and how this will help us realise the dream of the app store (or rather ‘stack stores’) that researchers will use in future.

Flanders on Heat 

The video is a high level “ten-thousand-metre and above” view of what we are proposing when we say that the community must start to make their research tools ‘stack-store’ compliant.

We’ll be demonstrating Heat at #eRes2013, and it will be released on the Research Cloud in early 2014, enabling anyone to build their own app store for their research communities.
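To give a flavour of what ‘stack-store compliant’ means in practice, here is a minimal sketch of a Heat Orchestration Template (HOT) describing a single-server tool. The image and flavour values below are placeholders for illustration, not actual Research Cloud identifiers:

```yaml
heat_template_version: 2013-05-23

description: Minimal sketch of a single-server research tool stack (illustrative only)

parameters:
  key_name:
    type: string
    description: Name of an existing keypair for SSH access

resources:
  research_tool:
    type: OS::Nova::Server
    properties:
      key_name: { get_param: key_name }
      image: placeholder-image-id   # substitute a real Research Cloud image
      flavor: m1.small              # placeholder flavour name
```

A template like this is the thing an ‘app store’ would share: launching it builds the whole stack, so the researcher on the receiving end never has to compile, configure or deploy anything by hand.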


Cloud vs HPC, Scalability vs Performance

The NeCTAR Research Cloud has now been in production for 18 months, with 3,100 users from nearly 70 different research organisations around Australia logging in. We’ve seen a huge variety of use cases: blogs, web servers, database services, genome analysis, high-performance computing (HPC) clusters and even the searchable history of gigs in Melbourne.

A question often asked by users is ‘should I be running on HPC or cloud?’. A question asked by people who are building clouds is ‘how fast should I make the disk?’. A question from the HPC aficionado is ‘why would I bother with cloud?’. To answer these we need to appreciate the difference in ethos between cloud and HPC. Without understanding that ethos, we end up with misconceptions about the relationship between cloud and HPC, who’s married to whom, and who said what about whom and when (my disk is faster than your disk, I’ve got more CPUs than you, and other schoolyard jibes).

A way to talk about the difference is to think about the cloud ethos of scalability and the HPC ethos of performance.

When we talk about performance, we are referring to a desired amount of computational throughput given a set amount of time and resources. Individual components are relatively high performing; it’s the dark art of HPC to squeeze as many FLOPS (CPU performance) and IOPS (disk performance) out of an HPC system as possible. As a result, the performance of the aggregate system is high (high-bandwidth interconnects, high disk throughput), but it generally costs a hell of a lot of money. When you want to get something done, you ask for a set amount of resource for a set amount of time. Same service for everyone. Same CPUs, same disk. One item on the menu. And an upper limit that will take another 3 or 4 years to change.

Scalability, on the other hand, is about how a system can grow to meet a need. Individual components are relatively low performing. Need more performance? Well, you just add more instances. By using truly commodity components, you hit a price-performance point that is more optimal than HPC.
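One way to make the two ethoses concrete: a single fixed-size job is capped by its serial fraction (Amdahl's law), while a pile of independent jobs scales linearly as you add instances. A toy sketch, with illustrative numbers only:

```python
# Performance vs scalability in toy form.

def amdahl_speedup(parallel_fraction, n_workers):
    """Speedup of one fixed-size job when only part of it parallelises."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_workers)

def fleet_throughput(jobs_per_instance_per_hour, n_instances):
    """Independent jobs: throughput grows linearly as instances are added."""
    return jobs_per_instance_per_hour * n_instances

# A 90%-parallelisable job tops out well below 16x on 16 cores...
print(round(amdahl_speedup(0.90, 16), 2))  # 6.4
# ...while 16 instances each running their own job give a full 16x throughput.
print(fleet_throughput(2.0, 16))           # 32.0
```

This is why ‘just add more instances’ works so well on the cloud for embarrassingly parallel workloads, and why tightly coupled jobs still favour the HPC approach.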

Where cloud truly differentiates itself from its HPC cousin is that cloud is Infrastructure as a Service (IaaS). That means I can pick and choose what ‘unit’ of compute I want, what performance I want, and how many of them I want. The menu is rich and delicious and full of promise.

If I want more of them, I simply ask for more. Because everything can be treated as a service (I mean everything, even the sort of disk you’re accessing), I can ask for a small virtual machine attached to high-speed storage (for the high-throughput disk speeds HPC users dream about), or a large virtual machine attached to desktop-quality storage (what I’m experiencing right now on my Windows desktop). There are many items on the menu, and the flexibility to expand what I need as I need it.

Both ethoses are important, but don’t be fooled into thinking that cloud is the lowly cousin of HPC. Scalability is a different beast from performance, and it lets you do sophisticated things you could never dream of with HPC. All you have to do is choose the menu that is right for you.

Emerging Technology morning teas

A few months ago, ITS Research began hosting the Emerging Technology morning teas. This is a forum for people interested in technology and gadgets to get together and have a play with new gear. Each month we have three technologies on show and someone to talk about them. Rather than a typical presentation-style approach, we like to keep things informal, with talks lasting only 10-15 minutes and plenty of question time. More importantly, we encourage attendees to have a play with the equipment to see how it works for themselves. While we have a research emphasis on the technologies we look at, some are chosen simply because they are fun or just plain cool.

Ben talking about AvayaLive Engage

The first morning tea…

In the first morning tea, Ben Loveridge showed off AvayaLive Engage, a very interesting immersive online collaboration tool, somewhat akin to Second Life. In this virtual space, avatars interact and converse using realtime audio and/or video conferencing, and can share presentations and other digital media. You can read more about AvayaLive Engage in an earlier post by Ben.

Ben Kreunen took the group through some of the challenges of scanning physical objects to produce 3D models. The University Digitisation Centre has recently purchased a NextEngine 3D scanner and has been testing it. Ben compared the various scanners available in the facility and discussed the benefits and pitfalls of each.

Mike Wang brought in the extremely cool Oculus Rift virtual reality headset.  This headset provides stereo immersive video, which is greatly enhanced by the realtime head tracking feature.  This allows the wearer to be immersed in a virtual environment and look around simply by turning their head.  Obviously the gaming market is the first to jump on board, but we are already thinking about what sort of cutting edge research we could do with something like this.

The second morning tea…

Ninja Blocks image

Suman Adirala kicked off the second morning tea with an introduction to Ninja Blocks.  Not just a really cool name, these cute little devices are amazingly useful.  Billed as a simple way to connect to the “Internet of Things”, a Ninja Block is a cloud-enabled device that receives information from sensors, such as temperature, humidity or movement sensors, and engages actuators, such as remote-controlled powerpoints.  For example, you could use a movement sensor to monitor your front door and turn on the TV or lights if someone approaches while you are away for the weekend.  Or you could check the status of a powerpoint and turn off the iron if you left it on.

Danial Fraser and his colleagues were also fascinated by the Oculus Rift and decided to build their own.  They showed off a remarkable device (in a 3D-printed case, no less) that is starting to seriously rival the Rift, and may even surpass it if they get their way.  They also had a Rift on display for comparison.  They built their DIY Rift themselves as part of an Engineering Masters project, and it just goes to show that Melbourne Uni is producing some quality graduates.

The third morning tea…

The third and most recent morning tea started with Ben Tam from Qubic, our first non-university presenter, talking about the 3D scanning options his company provides.  Ben demonstrated a high-end structured-light scanner, which produced excellent quality models and was clearly easy to operate.  He also showed off some new software that works with a Microsoft Kinect and scanned one of the attendees on the spot.  While not as good as the high-end scanner, the result was still very impressive.

Tommy Carron brought along a Leap Motion, a tiny USB device that provides a contactless computer interface.  Working much like a Kinect, the Leap Motion boasts far higher resolution, though in a far more confined space.  Tommy had a couple of sample applications that worked very well, but he also pointed out that the technology is so new that the real niche for the device probably hasn’t been identified yet.  Still, it is certainly fun to play with.

Lastly, we had Tim Wrigley talking about a consumer-grade augmented reality headset, Epson’s Moverio, which he purchased via eBay.  This device gives the wearer a distinctly geeky look (reminiscent of Star Trek) but does provide a good quality augmented reality experience.  Tim pointed out that the technology still hasn’t really taken off, but there is current research investigating how to integrate a depth camera, like a Kinect, into the design to allow the wearer to interact with the device hands-free.  Google Glass, anyone?  Tim is also looking at ultra-portable computers to drive the device.

So that is the sort of thing we have covered so far.  We are always open to suggestions as to what tech we should be looking at, so feel free to sign up to the emtech-info or dfab-info mailing lists to be kept up to date.  Hopefully we will see more of you at the next gathering.  Did I mention we provide the morning tea?

Tommy Carron and the Leap Motion

Group shot with Ben Tam

Tim Wrigley with his augmented reality glasses

LUV and 3D printing

On the 6th of August, I had the pleasure of giving a talk at the Linux Users of Victoria (LUV) meeting.  With my trusty Up Plus printer in tow, I presented to an audience of Linux enthusiasts, most of whom had heard of 3D printing, but few of whom had seen a device in action.  I brought along several samples showing the capabilities of fused deposition modelling printers, as well as the new Objet printer.  I fired up my standard presentation, but quickly found that while the audience may not have seen 3D printing before, they certainly knew a lot about it.  With questions coming thick and fast, I was certainly kept on my toes trying to answer the curliest of them.  Looking back, it was less a presentation and more a group discussion about the technology and the future it promises.

Bernard Meade at Linux Users of Victoria
[Image courtesy of Wen Lin]
Technology interest groups like LUV are fertile ground for imagining the future of all aspects of technology, not just their own particular sphere of interest. When presenting to kids at schools, I’ve found a similar unfettered enthusiasm for exploring the possibilities, and it can be easy to fall into the trap of thinking that it is kids who will come up with the best ideas.  Indeed, we should be actively promoting the tech to kids, whose minds are malleable, but there are many adults with that same wide-eyed wonder, with ready imaginations that can push the boundaries every bit as hard as a child who doesn’t yet recognise those boundaries exist.  Sure, I have encountered the flip side of the coin: on more than one occasion an adult has declared that they would rather not live in the world this technology represents.  But that is just fear of the unknown talking.

It is immensely satisfying to have the opportunity to bring this sort of emerging technology to people who are open to hearing about it.  No matter the age, it is priceless to see the moment when someone “gets it”, when the potential of 3d printing dawns on them.  It’s times like these when I think I have the best job in the world.

SmartBar 2 May 2013 – Image by Tim Marchinton

High Performance Computing (HPC) – Delivering for Researchers!

Over 200 researchers and research students from the University of Melbourne use the Edward high-performance computing service every day. In this first of a series of blog posts we explore how they’re using Edward, and explain how researchers could benefit from this supported, free and easy-to-use service.

Today, most research involves the use of computers in some way. When a researcher’s computational needs exceed the capacity of their desktop system, many turn to High Performance Computing systems. While huge HPC systems, sometimes called supercomputers, make headlines, even smaller systems can have a huge impact. Information Technology Services (ITS) operates an entry-level HPC system, called Edward, which is freely available to all researchers at the University. Here we want to share with you some examples of ‘Researchers doing great things’.

Continue reading “High Performance Computing (HPC) – Delivering for Researchers!”

Why publish data?

We get asked a lot about the benefits of sharing data. In the past, these benefits have often been couched in terms of the benefits for the consumer or user of the research data, and they are often quite obvious. Who wouldn’t consider utilising well-documented and accessible data collections relevant to their field of research?

Continue reading “Why publish data?”

Why must the University of Melbourne care about open data?

GovHack 2013, where data enthusiasts mined government-funded data released under open source principles, was a great success. See our blog post of 5 June 2013 to learn more about GovHack.

Last year’s GovHack was held in Canberra and Sydney, and attended by approximately 160 people. This year’s event was held in eight cities around Australia, including Melbourne, and was attended by 900 or so keen data users.

Next year’s event is going to be bigger.

Continue reading “Why must the University of Melbourne care about open data?”

Australia’s largest hackathon happened last weekend, and The University of Melbourne was there. Why?

Over the past weekend in eight cities around Australia, data enthusiasts mined government-funded data released under open source principles to create the best new mashups, data visualisations and real-world data applications.

They were competing in GovHack – a frenzied 48-hour hack-a-thon* that required them to form teams, analyse data and create outputs.

The event took place in eight cities and was attended by ~900 data enthusiasts. Nine hundred nerds hacking for 48 hours? That’s 43,200 hours, 5,400 eight-hour days, or 1,080 working weeks. They vied for more than $50,000 worth of prizes, ranging in value from $500 to $5,000. SBS World News Australia covered the event – check it out (the GovHack story starts at 17:23).

Continue reading “Australia’s largest hackathon happened last weekend, and The University of Melbourne was there. Why?”
