Scientific Scribbles

The voice of UniMelb Science Communication students

The Benefits of Basic Research

Basic research, also known as blue-skies research, is research driven by curiosity rather than by economic or technological gain. Over the years some have criticized it as a waste of money, to the point that funding for basic research in the US and the UK is much lower today than it was a few decades ago, and further cuts are being considered, particularly in the UK (I couldn’t find any data on research spending in Australia). The argument goes that even if basic research provides some benefits, we should let other countries spend their money on it, because the results will be available to all. I completely disagree with this.

There are two reasons why basic research is a critical part of government spending. The first is cultural. The cultural value of basic research is enormous: in our modern age, we know our place in the universe – the Earth orbits the sun, which is just one star in a spiral arm of a galaxy containing hundreds of billions of stars, which in turn is just one of the billions of galaxies in the visible universe; we know about DNA, fundamental to all life on this planet; we know how the universe started, and how it will end. I’m sure I don’t need to convince you of the value of all this knowledge, but we need to think about science in a cultural light from time to time, lest we forget that science has worth in and of itself, quite apart from any technological or economic advancement it enables.

Nonetheless, the economic benefits are most definitely there. Almost every major technological innovation has been predicated on a piece of basic research pursued for its own sake. From GPS and the World Wide Web in recent years (born of general relativity and CERN, respectively), back through transistors (the quantum mechanics of solids), all the way to the discovery of electricity and electromagnetic waves, basic research has underpinned every new technology for the past hundred years or more. Even fields as esoteric as number theory (a field of pure maths that once seemed about as far from any application as it is possible to be) have produced huge tangible benefits – in this case, modern cryptography.
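To make the number-theory-to-cryptography jump concrete, here is a toy sketch of RSA encryption in Python, using the small textbook primes p = 61 and q = 53 (real keys use primes hundreds of digits long, but the number theory – modular exponentiation and modular inverses – is exactly the same):

```python
# Toy RSA key generation and round-trip, with classic textbook primes.
p, q = 61, 53
n = p * q                 # public modulus: 3233
phi = (p - 1) * (q - 1)   # Euler's totient: 3120

e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi

message = 65
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
plaintext = pow(ciphertext, d, n)  # decrypt: c^d mod n

print(ciphertext, plaintext)  # 2790 65
```

Pure curiosity-driven results about primes and modular arithmetic, centuries old, now secure essentially every online transaction.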

The other part of the economic argument is that many of the individuals involved in basic research later go on to work in industry, where their experience with high-tech basic research is invaluable. Studies in the US have shown that companies founded by ex-researchers grow more quickly and are more likely to successfully float than the average US company. As such, the benefits from basic research apply most to the country that pays for it, even though the results are all public. One can also look at the correlation between investment in basic research and economic success, though of course correlation does not necessarily imply causation. The clearest example here is the UK, which has been steadily cutting funding for basic research over the past 20 years, and whose economy has been in steady decline for over a decade.

The returns from basic research may not be immediate, but the benefits it provides are undeniable, and the fact that governments are using the economic situation as an excuse to cut funding to universities is an outrage. From both a cultural and an economic standpoint, basic research is one of the best investments a government can make.

P.S. As I said at the start, I don’t know what the state of funding is like in Australia. On the one hand, the University of Melbourne cut back on a lot of staff about a year ago; on the other hand the “Schools of Excellence” seem to be fairly well funded. If anyone has actual numbers about the funding of basic research in Australia please post them in the comments.

For people who want more specific instances of technological spin-offs, and a good discussion of this issue in general, I found an article put out by CERN in 2008 to be very good.

Are you talking my language(s)? Pt. 2

Recently I discovered a whole new raft of programming languages, called esoteric programming languages, which are created not with form or functionality in mind, but simply for the sake of having a good old laugh. Here are a few of my favourites:

Photo: Global Nerdy

I’m programming. lol
LOLcode is a language which resembles the style of writing found in the Internet LOLcats meme. It is typed ENTIRELY IN CAPITAL LETTERS, with deliberate mizpellingz (sic) and errors what are grammatical-like (sic). LOLcode programs open with the greeting “HAI”, regularly include phrases like “I HAS A VAR” (to declare a variable) and “VISIBLE” (to print), and will exit the program using the keyword “KTHXBYE”.

More programming than you can shake a ‘speare at.
The Shakespeare Programming Language mimics the writing of the famous English playwright William Shakespeare. Data is introduced and deleted in the form of stage directions to actors, e.g. “[Enter ROMEO stage right]” or “[HAMLET exit stage left]”. Data is transferred between ‘characters’ using (often insult-ridden) Shakespearean dialogue, and output on the computer screen using phrases such as “Speak your mind!”. For example:

William Shakespeare. Photo: wikimedia commons

“Scene I: Romeo and Juliet’s conversation.
[Enter Romeo and Juliet]

Romeo:
Speak your mind. You are as worried as the sum of yourself and the difference between my small smooth hamster and my nose. Speak your mind!
Juliet: Speak YOUR mind! You are as bad as Hamlet! You are as small as the difference between the square of the difference between my little pony and your big hairy hound and the cube of your sorry little codpiece.
Speak your mind!
[Exit Romeo]”

Time on our hands.
INTERCAL – the short form, naturally, of its full name, “Compiler Language With No Pronounceable Acronym” – is another classic example of what can happen when programmers are left with too much spare time. For example, in INTERCAL the programmer must regularly include the keyword “PLEASE” in their program, lest it be judged “too impolite” and simply refuse to run. The real beauty of INTERCAL, however, is in the reference manual – the book written to help you understand how to program in a particular language. Whereas most programming reference manuals have additional information stored in an ‘Appendix’ section, the INTERCAL manual contains a ‘Tonsil’. As the author explains:

“Since all other reference manuals have Appendices, it was decided that the INTERCAL manual should contain some other type of removable organ.” – INTERCAL reference manual

The horror; the horror… 
But undoubtedly the WORST programming language BY FAR which I have ever encountered is the aptly named Brainf_ck. Brainf_ck has only eight – count ‘em: EIGHT – commands: > < + - . , [ ] (AND THAT’S IT!). To give an example of how truly horrendous and unreadable Brainf_ck programs are, here is a program which prints the text “Hello World!” to the screen (courtesy of Wikipedia). Rest assured, you don’t need to understand any programming to fully appreciate how nasty this is:

+++++ +++++             initialize counter (cell #0) to 10
[                       use loop to set the next four cells to 70/100/30/10
    > +++++ ++              add  7 to cell #1
    > +++++ +++++           add 10 to cell #2
    > +++                   add  3 to cell #3
    > +                     add  1 to cell #4
    <<<< -                  decrement counter (cell #0)
]
> ++ .                  print 'H'
> + .                   print 'e'
+++++ ++ .              print 'l'
.                       print 'l'
+++ .                   print 'o'
> ++ .                  print ' '
<< +++++ +++++ +++++ .  print 'W'
> .                     print 'o'
+++ .                   print 'r'
----- - .               print 'l'
----- --- .             print 'd'
> + .                   print '!'
> .                     print '\n'
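The flip side of having only eight commands is that a complete Brainf_ck interpreter fits in a few dozen lines. Here is a minimal sketch of one in Python (my own illustration, with the input command “,” accepted but ignored), running the Hello World program above:

```python
def run_bf(src):
    """A minimal Brainf_ck interpreter: eight commands and a tape of byte
    cells. Any character that is not one of > < + - . , [ ] is a comment."""
    code = [c for c in src if c in "><+-.,[]"]
    # Pre-match the loop brackets so [ and ] can jump directly.
    jump, stack = {}, []
    for i, c in enumerate(code):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jump[i], jump[j] = j, i
    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    while pc < len(code):
        c = code[pc]
        if c == ">":
            ptr += 1                          # move the data pointer right
        elif c == "<":
            ptr -= 1                          # move the data pointer left
        elif c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256  # increment current cell
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256  # decrement current cell
        elif c == ".":
            out.append(chr(tape[ptr]))        # output current cell as a char
        elif c == "[" and tape[ptr] == 0:
            pc = jump[pc]                     # skip the loop body
        elif c == "]" and tape[ptr] != 0:
            pc = jump[pc]                     # repeat the loop body
        pc += 1
    return "".join(out)

HELLO = """
+++++ +++++ [ > +++++ ++ > +++++ +++++ > +++ > + <<<< - ]
> ++ . > + . +++++ ++ . . +++ . > ++ .
<< +++++ +++++ +++++ . > . +++ . ----- - . ----- --- . > + . > .
"""
print(run_bf(HELLO))  # Hello World!
```

So the language is genuinely Turing-complete; it’s the programmer, not the machine, that suffers.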


Thankfully, these programming languages are the exception rather than the rule when it comes to ‘making electrons dance’ inside your beloved PC or Mac. Most languages, like Python and C++, are quite sensibly designed, and considerably more user-friendly than their esoteric counterparts!

Therefore I encourage everyone to do a little bit of research into a language like Python or C++, and have a good ol’ tinker. Programming can be a lot of fun and will let you create all manner of useful and fun applications to use, or to share with friends. What’s more, it will also teach you a lot you probably didn’t know about your computer, which may even save you from wanting to go all ‘percussive maintenance’ on it. 🙂

Happy coding!

Are you talking my language(s)? Pt. 1

A man dressed as a character from 'Tron' - an 80's movie about hardcore computer programmers. Photo: wikimedia commons

In my studies as a budding electrical engineer, I have been exposed to a small but varied range of computer programming languages. Some have been taught to me as part of my course, others I have attempted to teach myself – largely for my own sick, nerdy enjoyment. All of which I love. <3

Talking ‘turkey’ to the transistors.
For those of you wondering exactly what a computer programming language is… I must confess: my best answer to this question will, at first, appear less than earth-shattering. Simply put, computer programming languages are the languages you can use to ‘talk’ to computers, to tell them what you want them to do.

Whilst this answer may seem a bit of a cop-out, the reality is that programming languages are not all that different from any other language. All programming languages contain, in some form or another:
Nouns – which are used for naming various pieces of information or data.
Verbs – which tell the computer what action to perform at a given time.

These elements need to be combined according to the rules of the language’s syntax; that is: where to use your commas, semi-colons and full stops! Many languages also allow you to use other, crazy-yet-fun things like abstract nouns, adverbs and adjectives to further describe which tasks the computer should perform and how, but we’ll leave those alone for the time being.
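The nouns-and-verbs analogy is easy to see in practice. Here is a tiny made-up Python snippet (the names are my own invention, purely for illustration): the variables are the ‘nouns’, the function is a ‘verb’, and the parentheses, commas and indentation are the syntax holding it all together.

```python
# 'Nouns': names for pieces of information or data.
planet = "Earth"
moons = 1

# A 'verb': an action for the computer to perform on the nouns.
def describe(name, moon_count):
    return f"{name} has {moon_count} moon(s)."

# Syntax: the punctuation and layout that glue nouns and verbs together.
print(describe(planet, moons))  # Earth has 1 moon(s).
```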

Horses for courses.
Now, programming languages come in many shapes and sizes, and all have their own strengths and weaknesses – which ultimately define the situations in which you might use a certain language. There are languages designed for creating complex simulations of physical situations, for creating graphical displays, or for hefty mathematical number-crunching. Some languages are used for writing web pages and online applications, whilst others are used for writing ‘system-level’ code, which lives deep within the bowels of the machine, working its little guts out to ensure you never happen to meet the dreaded Blue Screen of Death.

The infamous 'Blue Screen of Death'. This may appear if your computer experiences a serious error. Photo: wikimedia commons.

Typing in tongues.
As I said, I’ve had some (often fleeting) experience with a variety of languages, so I’ll briefly explain what each one is, to act as an introduction to what languages are out there.

  • MATLAB (short for MATrix LABoratory) is a language which can be used for mathematical calculations and simulations (amongst other things).
  • Python (named after Monty Python) is a nice ‘all-rounder’ language which can do just about anything you ask of it. (We used it in 800-204 for text processing.)
  • C and its cousin C++ are a couple of those lovely ‘system-level’ languages which trudge through the dense jungle of your machine’s inner workings. Large portions of most operating systems, like Windows and Linux, are written in C or C++.
  • Prolog is useless; forget I even mentioned it. (Just kidding – Prolog is based on a form of logical reasoning called predicate logic, which allows you to construct complex statements and test them against a set of ‘rules’.)
  • LabVIEW (Laboratory Virtual Instruments Engineering Workbench) is one of the new additions to the family. It allows you to create your own custom virtual electrical hardware with clickable buttons, dials and switches, so you can test electrical equipment before you even build it.
  • MIPS32 is what’s called an assembly language. Assembly languages let you crawl about as deep within the machine’s brain as you can get – or should ever want to go. Get much deeper, and you’ll be pushing electrons around.
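To show what the ‘all-rounder’ in that list feels like in practice, here is a small text-processing job of the sort Python is handy for: counting word frequencies in a snippet of text. (This is an illustrative sketch of my own, not the actual 800-204 exercise.)

```python
from collections import Counter
import re

text = "the cat sat on the mat and the cat slept"

# Lower-case the text and pull out the words.
words = re.findall(r"[a-z']+", text.lower())

# Count how often each word appears.
counts = Counter(words)

print(counts.most_common(2))  # [('the', 3), ('cat', 2)]
```

Four working lines, no boilerplate – which is much of why Python is so often the first language people reach for.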

…but this is barely the tip of the iceberg. In my next post, I’ll tell you about a few of the weirder languages that are REALLY ‘out there’. Stay tuned!


Finding the One

“The ancient Sanskrit legends speak of a destined love, a karmic connection between souls that are fated to meet and collide and enrapture one another.

The legends say that the loved one is instantly recognised because she’s loved in every gesture, every expression of thought, every movement, every sound, and every mood that prays in her eyes.

(They) say that we know her by her wings — the wings that only we can see — and because wanting her kills every other desire of love.”

– Shantaram –

I remember a very interesting, slightly controversial conversation I had with a good friend of mine several months back. We were sitting in a little alleyway on Bourke Street, having ramen and discussing whether, practically, there can exist such a concept as ‘the one’ – that person for whom you are the perfect match, and whom, by the collusion of fate, serendipity and the stars, you are destined to meet, and to love forever more.

My friend, being of the more practical sort, was (and still is) of the opinion that there is no such thing. He argued that we make the best out of the circumstances we find ourselves in and that there is no ‘one’ perfect match because we are flexible in being able to make do with whatever and whoever we have.

I argued otherwise.

Firstly, this is because I am a hopelessly hopeless romantic, moulded by a steady diet of romantic comedies and tear-jerkers such as The Notebook. Secondly, I am female and therefore am genetically programmed to be impractical. The final and most important reason for my unbudging belief in this concept of ‘the one’ stems from the fact that I am, at heart, a die-hard immunologist #.

Let me explain.

Our bodies are patrolled by a number of cells called lymphocytes, a subset of White Blood Cells. There are two types of lymphocytes, B cells and T cells.

These cells are especially important components of our immune system in that they recognize foreign substances that do not (normally) belong in our body, and then mount an attack against them in order to eliminate them. In other words, they seek out and destroy foreign invaders, which typically tend to be organisms such as bacteria and viruses.

Both B cells and T cells have receptors on their surfaces.

Any molecule that can be bound by one of these receptors and stimulate an immune response is known as an antigen. When a receptor binds to an antigen, the cell – be it a B cell or a T cell – is essentially triggered into action.

If the receptor belongs to a B cell, the B cell will start to make many, many clones of its receptor, which it then releases, or secretes, into its surroundings. These free-floating clones of the original receptor are known as antibodies. They have binding sites identical to those of the original receptor and, from then on, recognize only the antigen recognized by the original receptor.



If the receptor belongs to a T cell, it remains bound to the T cell’s surface and does not get secreted. Instead, the T cell, depending on its type, will either begin to produce certain molecules known as cytokines, which guide the immune reaction onwards, or toxic materials that result in the destruction of the foreign invader.

In most if not all cases, a receptor is specific for only one antigen. Each lymphocyte carries thousands of receptors, but these receptors are all of one specificity (that is, they all recognize the same thing).

T cells are slightly different to B cells in that they require the antigens they recognize to be presented to them by what are known as MHC molecules on cell surfaces. These MHC molecules act as signposts, alerting and activating T cells when foreign substances are present.

T cell receptor recognizing an antigen presented by an MHC molecule

B cell receptor recognizing an antigen

If you think about all the different substances and microbes that we are exposed to every day, there is an enormous number of potential antigens out there that our body needs somehow to be able to recognize and combat. In order to do this, our bodies generate a huge repertoire (some 10^11) of B cell and T cell receptor specificities.

It has been described to me that the chance of one of these receptors encountering an antigen it is specific for is a bit like winning Tattersalls. (I can relate to this.)

However, with our own immune system rapidly generating new types of receptors with different specificities all the time, their chances are considerably higher than ours are of winning the lottery.

What we can take away from all of this is that every lymphocyte has its own specific antigen that it is destined to some day meet and mount an immune response against. This encounter requires a certain degree of chance and coincidence, in that both antigen and lymphocyte must be in the same place at the same time.
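The numbers game can even be sketched in code. Below is a deliberately toy Python model (my own illustration, not a real immunological simulation): each lymphocyte is assigned one specificity out of a repertoire, antigens drift past at random, and we count the encounters until one is a match. A repertoire of 10,000 stands in for the real ~10^11, just so the toy run finishes quickly.

```python
import random

random.seed(42)  # make the toy run reproducible

REPERTOIRE = 10_000  # toy stand-in for the real ~10^11 specificities

# Each lymphocyte recognizes exactly one specificity (clonal selection).
lymphocyte_specificity = random.randrange(REPERTOIRE)

# Random antigens drift past until one matches the receptor.
encounters = 0
while True:
    encounters += 1
    antigen = random.randrange(REPERTOIRE)
    if antigen == lymphocyte_specificity:
        break

print(f"Match found after {encounters} encounters")
```

On average the wait is about as long as the repertoire is large – yet, given enough sampling, the match always comes.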

Nevertheless, there is an antigen out there, somewhere in the universe, for this lymphocyte. *

It’s simply a matter of sampling until it finds the right one.

# (as opposed to a realist, or an idealist, or any other -ist)

* Corny… I know.


A friend of mine recently passed on the summer 2010 issue of a magazine called Adbusters to me. While procrastinating, I decided to flip through it. I came across one article and found it extremely engaging and interesting. It is about the ecology of mind.

“For thousands of generations we humans grew up in nature. Our teachers were flora and fauna and our textbooks thunderstorms and stars in the night sky. Our minds were like the forests, oases and deltas around which our cultures germinated: chaotic, wild, fecund.” – Kalle Lasn and Micah White

But in the last few generations, we have abandoned the natural world and absorbed ourselves in virtual realms.  By relocating from nature we have essentially altered the context in which we live our lives.

Just as we make this transition to a new psychic realm, we are also seeing exponential growth in mental illness. The UN has predicted that by 2020 the burden of mental diseases (anxieties, mood disorders and depression) will be greater than that of heart disease.

So why exactly are we breaking down mentally? Psychologists would point to the breakdown of community, the insecurity of social roles, the stresses of modernity and globalization, and maybe even chemicals in the air, water and food that may be affecting our brains in unknown ways. Or should we blame the thousands of aggressive marketing messages our brains absorb every day, or the heavy internet use that can lead to addiction and depression – a digital revolution rewiring our brains in unhealthy ways?

Eco vs. Psycho

It is interesting to think that as we pollute the earth, we simultaneously pollute our brains. I have summarized the factors affecting our minds, and I hope people will reflect on them.


For countless generations the ambient noise was rain and people walking. Now three generations have become stimulation-addicted: we can’t work without background music, can’t jog without earphones, can’t sleep without an iPhone tucked under the pillow.

‘Silence may be to a healthy mind what clean air and water are to a healthy body.’



Every day, an estimated 12 billion display ads, 4 billion radio commercials, more than 200,000 TV commercials and an unknown number of online ads and spam emails are dumped into our collective unconscious.



The first time we saw a starving child on a TV ad, maybe we sent some money. But as these images became more and more familiar, our compassion faded. Eventually such ads started to annoy us, even repulse us. And now we feel nothing when we see another starving kid.

“The commercial media are to the mental environment what factories are to the physical environment. A factory dumps pollution into the water or air because that’s the most efficient way to produce plastic or wood or steel. A TV station or website pollutes the cultural environment because that’s the most efficient way to produce audiences. It pays to pollute. “



At first all that information was pleasant. It felt as if the sum of all knowledge was only a hyperlink away and we skipped joyously down the infotrail, sending emails, adding bookmarks and hopping from site to site late into the night. But as the initial glow wore off, we were left in a state of digital daze: unable to concentrate, feeling foggy, anxious and fatigued.

Our smart phones, notebooks and computers now keep us online constantly.  While waiting in line at the supermarket or enjoying an evening walk or reading a book, we keep texting our friends and receiving quick Twitter updates. We are drowning in an endless stream of connectivity.  And future generations may be even more wired.

Our online lives may now be impairing our ability to follow a sustained line of thought, to think deeply about something.

Maybe we are suffering from the info-disease that Nicholas Carr first diagnosed in himself. I will conclude here with my favorite part of the article:

“Over the past few years, I’ve had an uncomfortable sense that someone or something has been tinkering in my brain, remapping the neural circuitry, reprogramming the memory. What the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.”
