Entries

Lin Clark on WebAssembly

Today I will talk about the podcast “Lin Clark on WebAssembly”, released by Software Engineering Radio in 2018. The interview starts with Lin Clark talking about her life, what she has done, and what she is currently working on. Then she moves on to WebAssembly and why it can be faster than JavaScript: JavaScript was never designed for that kind of performance, so WebAssembly complements it by taking over performance-critical work when running in browsers. She said that the original use case of WebAssembly was visualization, but now it is being used in other scenarios. It is not a true assembly language, because it targets a conceptual machine rather than a physical one; it is a compiler target. So we write in a language that compiles to WebAssembly. Most people use C/C++ or Rust, languages that are much better suited to it because they don't rely on garbage collection; using another language would mean shipping a garbage collector along with the program, which I don't think anyone would want, not even the interviewer. Something that I did
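To get an idea of what “targeting a conceptual machine” means, here is a toy sketch of my own in Python (it is not real WebAssembly, which is a stack-based virtual machine, but the idea is analogous): a tiny compiler turns an expression into instructions for a pretend stack machine, and a separate loop executes them.

```python
# Toy illustration of a "compiler target": compile an expression AST
# to instructions for a conceptual stack machine, then execute them.

def compile_expr(ast):
    """Compile a nested tuple AST like ('+', 2, ('*', 3, 4)) to stack code."""
    if isinstance(ast, (int, float)):
        return [("push", ast)]
    op, left, right = ast
    return compile_expr(left) + compile_expr(right) + [(op,)]

def run(code):
    """Execute the stack-machine instructions."""
    stack = []
    for instr in code:
        if instr[0] == "push":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"+": a + b, "*": a * b}[instr[0]])
    return stack.pop()

program = compile_expr(("+", 2, ("*", 3, 4)))
print(run(program))  # 14
```

Any language that can emit those instructions can “target” this machine, which is the same relationship C, C++, and Rust have with WebAssembly.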

Building Server-Side Web Language Processors

Today I will talk about the article “Building Server-Side Web Language Processors” by Ariel Ortiz. The main topic of this paper is a web-based approach to teaching compiler design. First the author introduces today's way of teaching the course, in which students build a very small but fully functional compiler by the end of the semester; the author considers that this still leaves many subjects and concepts untaught, so to address it Ariel suggests that the course be taught around a web-based compiler architecture. Beyond the advantages the course itself gains from a web approach, the author notes that many students will become web developers by the time they graduate, so this way of teaching compiler design also helps them understand the web architecture of an application. The concept is basically to have a server that, by establishing rules and APIs that make the web-based language useful enough, is able
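The general shape of a server-side language processor can be sketched with Python's standard library (this is my own minimal sketch, not the architecture from the article): an HTTP endpoint receives source text and responds with the result of processing it. The `process_source` function is a hypothetical stand-in for a real lexer/parser/evaluator pipeline.

```python
# Minimal sketch of a server-side language processor: POST source
# text to an endpoint, get the processed result back as JSON.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def process_source(source):
    """Stand-in language processor: sums whitespace-separated integers."""
    return sum(int(tok) for tok in source.split())

class ProcessorHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        source = self.rfile.read(length).decode()
        body = json.dumps({"result": process_source(source)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("localhost", 8000), ProcessorHandler).serve_forever()
```

Swapping `process_source` for a real compiler front end is what would turn this into the kind of web-based language service the article proposes.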

Ruby and the Interpreter Pattern

Today I will talk about the article titled “Language Design and Implementation using Ruby and the Interpreter Pattern” by our professor Ariel Ortiz. The article describes an implementation he made to evaluate different LISP expressions given as strings, using a framework called the S-Expression Interpreter Framework, and gives some Ruby code examples to demonstrate how it works. The principle behind the interpreter pattern is that some problems are easier to solve by creating a specialized language to express them, and then stating the results in that same language. This involves both the syntactic and, more importantly, the semantic analysis phases of a compiler (the phase we are currently working on in our project). It relies on a data structure called the Abstract Syntax Tree (AST), where the operands and the hierarchy of functions are organized to impose logic and order on how the machine thinks through and executes the program. After the
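The core of the interpreter pattern can be shown in a few lines. This is an analogous sketch in Python rather than the article's Ruby framework: each node class knows how to evaluate itself, and nesting the nodes forms the AST.

```python
# Interpreter pattern: every AST node implements evaluate(), and
# evaluation recurses down the tree.

class Number:
    def __init__(self, value):
        self.value = value
    def evaluate(self):
        return self.value

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def evaluate(self):
        return self.left.evaluate() + self.right.evaluate()

class Multiply:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def evaluate(self):
        return self.left.evaluate() * self.right.evaluate()

# AST for the LISP expression (+ 1 (* 2 3))
tree = Add(Number(1), Multiply(Number(2), Number(3)))
print(tree.evaluate())  # 7
```

Parsing the string “(+ 1 (* 2 3))” into that tree is the syntactic analysis; calling `evaluate` is the semantic part.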

Mother of Compilers

Today I will talk about the biography of Grace Hopper and her role in the development of the first compilers. The first piece, the article, presents Grace Hopper not only as the person responsible for the development of the COBOL language, but also for the constant pressure she exerted on the industry to advance computing and make it accessible. One of her reasons for this was to bring women's interests in research and computing careers to the forefront, because in those times it was difficult for women to take an interest in fields that were “only for men”; she went on to become one of the most famous and important software experts in the Navy. I can say, as a woman, that I really admire her courage for standing out in a patriarchal world. One thing I didn't know is that Grace Hopper was the first person to use the term “bug”, one of the most widely used software development terms nowadays. The second piece, the mini-documentary, covers exactly the same ground. It's like both the article and

Internals of GCC

Today I will talk about the podcast “Internals of GCC” by Software Engineering Radio with Morgan Deters as a guest. This podcast discusses compilers and how they work internally. It covers all the stages of a compiler's construction (in this case GCC, the GNU Compiler Collection), going from parsing different programming languages to machine optimizations and the generation of processor binary code. At the beginning I found the recording a little tedious, but as it continued I learned more about the subject. I learned many things about GCC I had never suspected, despite having used it for many C projects. For instance, I thought it only supported C programs, but I found out that it also works for C++, Java, and some other languages. It was also very interesting to learn about how the compiler works: it goes through three different phases in which source code is transformed into target code the computer can understand and execute correctly.
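The three-phase structure can be sketched in miniature (this is my own toy illustration, nothing like GCC's actual internals): a front end parses source into an intermediate representation, a middle end optimizes that IR, and a back end emits code for the target machine.

```python
# Toy three-phase compiler pipeline: front end -> middle end -> back end.

def front_end(source):
    """Parse a tiny 'a + b' expression into an IR tuple."""
    left, op, right = source.split()
    return (op, int(left), int(right))

def middle_end(ir):
    """Optimize the IR: fold constant additions."""
    op, a, b = ir
    if op == "+":
        return ("const", a + b)
    return ir

def back_end(ir):
    """Emit instructions for a pretend target machine."""
    if ir[0] == "const":
        return [f"LOAD R0, {ir[1]}"]
    return [f"LOAD R0, {ir[1]}", f"LOAD R1, {ir[2]}", "ADD R0, R1"]

print(back_end(middle_end(front_end("2 + 3"))))  # ['LOAD R0, 5']
```

The separation is what lets a real compiler like GCC pair many language front ends with many processor back ends through one shared middle.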

The Hundred-Year Language

Today I will talk about the article titled “The Hundred-Year Language” by Paul Graham. The author starts with a great comparison: just like species, languages, including programming languages, form a kind of evolutionary tree, where some of them end up as dead-end branches. A clear example is COBOL, which, despite its past popularity, doesn't seem to have any descendants, a fact that automatically makes it an evolutionary dead-end. I bet that in the great COBOL days nobody thought it wouldn't evolve, have descendants, or stop being commonly used. I think that an interpreter is a reduced version of a compiler; that is, an interpreter goes through steps similar to a compiler's. The difference I notice is that a compiler generates intermediate code, while an interpreter executes instructions directly. But the main feature of both is parsing and processing input using tokens and syntax analysis. It is very interesting when you realize
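Python itself can illustrate that compiler/interpreter distinction: the built-in `compile` turns source into intermediate bytecode that can be inspected with the standard `dis` module, while evaluating the source string directly just gives the result.

```python
# compile(): source -> intermediate bytecode; eval(): execute directly.
import dis

source = "2 * (3 + 4)"

code = compile(source, "<example>", "eval")  # compiler path: produces bytecode
dis.dis(code)                                # inspect the intermediate code

print(eval(code))    # executing the compiled code object: 14
print(eval(source))  # interpreting the source string directly: 14
```

Both routes give the same answer; the difference is that the compiled form is a reusable intermediate representation, while direct evaluation leaves no artifact behind.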

Making Compiler Design Relevant for Students

Today I will talk about the article titled “Making Compiler Design Relevant for Students who will (Most Likely) Never Design a Compiler”. Basically, the author discusses the principles, techniques, and tools covered in compiler design courses and how they apply to a great variety of situations that aren't exactly compiler design. He walks through the different processes that take place during compilation, beginning with lexical analysis, in which the compiler goes character by character recognizing tokens and special characters. Done naively, this process is really expensive, so there are many techniques to make it more efficient; regular expressions are particularly useful in this phase. Everything then goes through syntax analysis, where the tokens are further processed and recognized as the words or phrases that give the code its structure. Then comes the more complex semantic phase, where according to the grammar specification
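The lexical phase described above can be sketched with Python's standard `re` module (my own small example, not code from the article): instead of hand-stepping through every character, the token patterns are declared as regular expressions and the lexer matches them in order.

```python
# A tiny regex-based lexer: declare token patterns, then scan the
# source and collect (kind, text) pairs, discarding whitespace.
import re

TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Turn raw characters into (kind, text) tokens."""
    tokens = []
    for match in MASTER.finditer(source):
        if match.lastgroup != "SKIP":  # whitespace is discarded
            tokens.append((match.lastgroup, match.group()))
    return tokens

print(tokenize("total = 4 + 10"))
# [('IDENT', 'total'), ('OP', '='), ('NUMBER', '4'), ('OP', '+'), ('NUMBER', '10')]
```

The same declare-patterns-then-scan idea shows up well outside compilers, in log parsing, input validation, and configuration files, which is exactly the article's point about the material staying relevant.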