In recent weeks a project has been established between University of Melbourne marine researchers, led by Dr. Eric Treml, and Nyriad, a New Zealand startup specialising in GPU software, to optimise code for marine population sampling. Nyriad’s main mission, however, is to resolve one of the biggest and growing problems in computation: the widening gap between data computation and data I/O. The technical solution, led by Alex St. John (whose past work includes DirectX, foundational to the first GPUs, and Google Maps), is to combine compute and I/O on GPUs.
This is, of course, a very simple description of a complex problem under development, and one which has been explored before (for example, see M. Silberstein, “GPUfs: Integrating a File System with GPUs”, 2013), but without the technological breakthrough needed for mass adoption. The organisation itself is an example of an “agile startup done right”, which I discuss in much more detail on my personal ‘blog (this is a technical ‘blog relating to research computing, not organisational management). Nyriad employs a wide range of young engineers and computer scientists, often sourced from the nearby University of Waikato; one notable staff member is Bill Rogers, senior lecturer in Computer Science at that institution (will this finally be the home of Avoca?).
Whilst at Nyriad, time was spent with Andreas Wicenec, through whom Nyriad has established milestone agreements for processing data from the precursor of the Square Kilometre Array. From the University of Melbourne’s point of view – in addition to Nyriad’s ability to assist research projects with GPU programming – two extremely important considerations are the ability of Nyriad’s technology to replace traditional RAID and storage services with GPUs whilst providing greater resilience, and any potential GPU extensions to the existing Spartan cluster.
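The RAID-replacement idea can be illustrated with a toy example. The sketch below is a conceptual, CPU-side illustration of RAID-5-style single-parity reconstruction (the function names are mine for illustration, not Nyriad’s API); the general approach is to treat this kind of erasure-coding arithmetic as a massively parallel compute problem and offload it, at far larger scale and with stronger codes such as Reed–Solomon, to GPUs.

```python
# Conceptual sketch only (not Nyriad's implementation): single-parity
# protection in the style of RAID-5. One XOR parity block lets us
# rebuild any one lost data block from the survivors.

def parity(blocks):
    """XOR all blocks together, byte by byte, to produce one parity block."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            result[i] ^= byte
    return bytes(result)

def reconstruct(surviving_blocks, parity_block):
    """Recover a single lost data block: XOR of survivors plus parity."""
    return parity(surviving_blocks + [parity_block])

# Three equal-sized "disk" blocks and their parity.
data = [b"disk0data", b"disk1data", b"disk2data"]
p = parity(data)

# Simulate losing disk 1 and rebuilding it from the other disks plus parity.
recovered = reconstruct([data[0], data[2]], p)
assert recovered == data[1]
```

Because XOR (and, more generally, Galois-field multiplication in Reed–Solomon codes) is embarrassingly parallel across bytes, it maps naturally onto GPU hardware, which is what makes a GPU-based replacement for dedicated RAID controllers plausible.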