Garo Garabedyan's Divergent Thinking Blog

MetaOCaml: Resource-Aware Programming


Walid Taha, Resource Aware Programming, Google Tech Talks, January 22, 2007 is a remarkable YouTube video, an hour and ten minutes long, about MetaOCaml. It let me touch a language I had long been dreaming of: one with flexible abstraction design that still controls all the resources (including computation space and time), and with reactive functional programming and a separation of the stages of computation. This multi-stage separation enables more efficient calculation, based on how early each parameter of the calculation becomes available. Programming in MetaOCaml has a mathematical form, using mostly recursive expressions, which gives more certainty about the code; the code is then translated to C, and from that point on everything is explicit.

With the aim of promoting knowledge of the above, I have made a transcript of Taha's introduction (~10 min.). I faced similar problem domains in G. Garabedyan, Data Flow Processing, eventBased Algorithms and Data, 2008, and particularly in G. Garabedyan, eventBased Framework, 2009.

Thank you very much, Chris. It is a pleasure to be here today, and the goal is to tell you about the interesting research in programming languages that we are doing at Rice. I will avoid getting into technical details, theorems, formalisms and so on, and focus on various examples from languages that we have built, and on intuitions and motivation drawn from the kinds of problems that we would like to see our languages used to solve.

So, to get started, I would like to say that one of the keys to building better programming languages is designing abstractions that help us build the programs we want to write. Abstractions like arrays, functions, objects, modules and so on — all of these are things we have seen before in different application domains, like high-performance computing or databases, where they have proven successful, and there are a lot of standard examples of these abstractions that we know pretty well. In a lot of cases they do a very simple thing, which is to make our programs shorter and more manageable. Often they also make solutions easier to reuse: when we have solved a particular problem before, it is easy to reuse that solution on a new problem. And if they are designed carefully, they also help us verify and reason about the correctness of our programs. Reasoning in particular is something we are very interested in, and when we are able to do it, what we get at the end of the day is a guarantee about our program — often through the language we have created, or its type system or compiler — and the guarantee says that a specific bad thing will not happen at run time. So you build a program, and it looks like it is supposed to be doing a certain thing, but we are also interested in knowing that there are other bad things it is not going to do. Of course, ultimately, if we take verification all the way, we go the next step and make sure it is doing what we wanted it to do. But a very practical thing is to have at least a guarantee that certain bad things will not happen, and here we are interested in classic things like buffer overflows, run-time errors and so on. There is a very intricate interplay between the design of abstractions and what we can do in terms of reasoning about our programs afterwards.
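As a small illustration of the kind of compile-time guarantee Taha mentions (a sketch in plain OCaml, added here for clarity; it is not from the talk):

```ocaml
(* The type checker guarantees, before the program ever runs, that
   total is only applied to lists of integers. *)
let total (xs : int list) = List.fold_left ( + ) 0 xs

let six = total [1; 2; 3]   (* well-typed: evaluates to 6 *)

(* total ["a"; "b"] would be rejected at compile time, so the
   "bad thing" — adding strings as numbers — cannot happen at run time. *)
```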

Alright, so the last slide was meant as motivation for the design abstractions that we get in high-level programming languages. High-level programming languages, to me, are things like OCaml, ML, Haskell — all these very high-level languages that we know about. Now, if we go to the real world and look at the programs people are writing today, we find things like the controllers that run inside our cars, all kinds of medical and health-services applications, and robotics applications; they run very important control software that determines what the rest of the metal does, and the same goes for defense applications. So, what languages do people use to write these pieces of software? For the most part, things like C and C++, and where performance matters most — where there are big computational components in the system — you might find they are written in FORTRAN. Twenty or forty years ago these were high-level languages; today they are not what we would consider high-level, given the innovations we have seen over the last 20-30 years. So why is it that, for a lot of the software being written today, we do not see high-level languages being used? My thesis here is that at the same time we adopt these more advanced abstractions, we are also giving up control over fairly important low-level details. And in a lot of cases, especially for embedded and real-time systems, whatever the benefits we get from high-level abstractions, the programmer responsible for building the system at the end of the day cannot really give up control of the low-level resources, because the programmer or system designer is the person responsible for achieving the real-time guarantees or the various performance characteristics that the system must have.
There are a lot of examples of abstractions that could illustrate this point; the simplest one is maybe automatic memory management. Automatic memory management allows us to think of dynamic data structures at a higher level than a model where we allocate data explicitly and deallocate it at the end of the day. And we get a huge safety advantage with automatic memory management, because we only have allocate operations and no explicit deallocations, so we do not have to worry about dangling pointers, which are a big safety problem in languages without automatic memory management. So we get a very nice abstraction, but at the same time this means we relinquish control over when things get allocated and when they get deallocated. In addition, because the system has to do more work for us, and we do not necessarily know when this work will be done, we may be doing a critical computation that needs to finish quickly for a particular application, the garbage collector decides to run, and suddenly we have no idea how long the computation is going to take. This is just one example of many instances where we care not only about the functionality of our program, in terms of standard input/output behavior, but also about the resources being consumed: the basic resources being the time and space needed for the computation, but also all kinds of notions of resource that depend on the particular problem we are solving with the software artifact we are building. The question we are interested in is: what abstractions can be used to allow people to build the programs they want to build, while at the same time getting guarantees about the resources these programs use?
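The memory-management trade-off can be sketched in plain OCaml (an illustration added here, not from the talk): allocation is implicit and there is no free, so dangling pointers are impossible, but the programmer no longer chooses when reclamation happens.

```ocaml
(* sum_squares allocates an intermediate list; nothing is freed
   explicitly.  The garbage collector reclaims the list once it is
   unreachable, at a moment the programmer does not control. *)
let sum_squares n =
  List.init n (fun i -> i * i)   (* allocated by the runtime *)
  |> List.fold_left ( + ) 0      (* the list becomes garbage here *)

let s = sum_squares 4            (* 0 + 1 + 4 + 9 = 14 *)
```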

The goal of our research is basically to be able to produce languages that are as expressive as possible for the programmer, so that we can still structure our problems as clearly as possible and reuse the solutions we had before, while, side by side with this expressiveness, having guarantees that are as strong as possible about the programs we write. Essentially, we do not want the mechanisms that give us guarantees and better properties to get in the way of writing programs as naturally as possible. There are essentially three components to our approach. The first is called multi-stage programming, and essentially this is a way to separate the design that is captured by our program from the actual implementation that comes out of the design at the end of the day. One way to think of this is that our designs are going to generate the programs that we need at the end of the day. Another aspect, especially important for real-time and embedded systems, is to incorporate a treatment of reactivity: a mechanism that allows us to write programs naturally where these programs interact with a stream of inputs coming from the outside and produce a stream of outputs back to the environment. And finally, we use advanced type-system techniques, in particular linear types and indexed types, to express much finer properties than a typical or traditional type system would let us.
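Multi-stage programming in MetaOCaml rests on three constructs: brackets build a code value, escape splices one code value into a larger fragment, and run compiles and executes the generated code. A minimal sketch in classic MetaOCaml syntax (which needs the MetaOCaml compiler, not plain OCaml):

```ocaml
(* Brackets .< e >. delay e, producing a code value rather than a result. *)
let c = .< 1 + 2 >.        (* the code of 1 + 2, not the number 3 *)

(* Escape .~ splices an existing code value into a larger fragment. *)
let d = .< .~c * .~c >.    (* the code of (1 + 2) * (1 + 2) *)

(* Run .! compiles the generated code and executes it. *)
let v = .! d               (* evaluates to 9 *)
```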

So, the plan for the rest of the talk: we are going to start with a characteristic model of traditional computing called batch computing. All the inputs are available in one shot to the program we want to build, and the output comes only after we have all the inputs that are needed. Then we are going to extend this model in one dimension: here, some of the inputs arrive early, and we use them as soon as they arrive to do some computation and produce a new, specialized computation. The specialized computation can then receive the rest of the inputs and produce the output more efficiently. We will later see how this separation is also useful for getting stronger guarantees about the final phase of the computation. The other, almost orthogonal dimension is reactivity; we will talk briefly about what this means, what programs written for this model of computation look like, and the work that was done to make sure we have stronger guarantees in a reactive programming model. In some sense, we are going to combine these two approaches at the end of the day — that is essentially the ultimate goal of the research that we are doing.
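The early-input/late-input split described above is exactly what staging expresses. Here is the standard staged power function, a well-known MetaOCaml example (classic syntax, added for illustration): the exponent arrives early, the base arrives late.

```ocaml
(* power n builds code for x * x * ... * 1 with n multiplications,
   using only the early input n. *)
let rec power n x =
  if n = 0 then .< 1 >.
  else .< .~x * .~(power (n - 1) x) >.

(* Running the generator yields a function specialized for the late
   input: cube computes x * (x * (x * 1)) with no loop or test left. *)
let cube = .! .< fun x -> .~(power 3 .< x >.) >.

let eight = cube 2   (* the residual code evaluates to 8 *)
```

Specialization pays off when the early input is fixed across many calls: the recursion and the test on n happen once, at generation time, and every later call runs only straight-line multiplications.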


1. (PDF) Presentation of Walid Taha on Resource-Aware Programming (RAP)

2. (HTML) Course description of Resource Aware Programming at Computer Science, Rice University


