2023-2024 lectures: Control structures: from "goto" to algebraic effects
Seminar of 22 February 2024: Compiling with Continuations

Speaker: Andrew Kennedy, Meta

Find the audio and video recordings of the series and its presentation text:
https://www.college-de-france.fr/fr/agenda/seminaire/structures-de-controle-de-goto-aux-effets-algebriques

Software Sciences chair
Professor: Xavier Leroy

Find all of Prof. Xavier Leroy's lectures:
https://www.college-de-france.fr/chaire/xavier-leroy-sciences-du-logiciel-chaire-statutaire

The Collège de France is an institution devoted to fundamental research in every field of knowledge, and a place for the dissemination of "knowledge in the making", open to all.
Its lectures, seminars and colloquia are recorded and then made available to the public on the Collège de France website.

Discover all the resources of the Collège de France:
https://www.college-de-france.fr

Follow us on:
Facebook: https://www.facebook.com/College.de.France
Instagram: https://www.instagram.com/collegedefrance
X (formerly Twitter): https://twitter.com/cdf1530
LinkedIn: https://fr.linkedin.com/company/collègedefrance

So I'm going to switch to English. We have the pleasure of welcoming Andrew Kennedy, who works at Meta in London. Andrew is an expert in functional programming, in programming languages and their implementations, and especially functional languages. When he was at Microsoft Research he was, I guess, the main designer of generics in the .NET virtual machine, so if you have generics in C# today, you should thank Andrew for that. But he has also worked on other projects, especially on a Standard ML compiler which uses continuations as an intermediate representation, so he's going to talk about that; and at Meta he has recently found other uses for continuations in programming-language implementation. So thank you, Andrew.

Thank you, Xavier, for inviting me, for that very nice introduction, and for a lovely talk. I tried to understand the French, but at least the OCaml was in English, so not so hard. OK, so I'm going to talk about compiling, but also typing, with continuations. As you know, continuations are one of the best and oldest ideas in computer science, with many, many applications; here are a few. We've already seen some of this with Xavier's talks: semantics is one place, and I'm going to talk a little bit about compiling, but it's a generally useful concept for programming as well as for programming-language theory. So I'm going to give a couple of examples before I launch into my main topics.

Here's one example of thinking in the continuations mode. This is something I did a few years ago with Nick Benton at Microsoft Research. We were looking at exceptions. If you look at OCaml's try ... with construct, it's the usual way one does exceptions in a functional language; Standard ML did the same.
But sometimes it's clumsy to use, and it's hard to put your finger on why, or on what would work better. In this paper we give examples, from programming but also from semantics, of why this construct is maybe not quite what you always want. The solution was really to think in terms of continuations, and Xavier has already introduced this in a way, because the effect-handler construct is very similar to what I'm showing here. A better-behaved construct for exceptions has a failure continuation and a success continuation, which is what you would get from double-barrelled CPS, or, as I learned today, there is a French term for it; I learned some new French. So perhaps one wants a generalized let construct instead of the existing try ... with: you're binding a variable to some expression, but you have an exception handler for the failure case as well as a continuation for the success case. In fact, in OCaml, ten years ago, they introduced a generalized match construct which serves the same purpose: we match on successes, and have an exception pattern to handle failure, so you've got multiple continuations. That's one example of thinking in continuations when working on programming-language design.

Another example is something I did a few years ago, also at Microsoft, on machine code. We wanted to come up with some sort of program logic for machine code. The usual way you do program logic dates back to Hoare logic from the 70s, where you have triples: some code C, a precondition P and a postcondition Q, meaning that if the program state satisfies the predicate P when you enter the code, then it will satisfy Q on exit from the code. But another way to think about this is to break it down into something a little more primitive, which is what we did for machine code.
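As an aside, the generalized match construct mentioned a moment ago looks like this in OCaml. This is a small illustrative example of my own, not from the talk's slides:

```ocaml
(* OCaml's "match ... with exception" (available since OCaml 4.02):
   the success case and the failure case are branches of a single
   match, so the exception handler does NOT cover the body of the
   success branch. This gives the two-continuation behaviour the
   speaker describes: one branch per continuation. *)
let describe_div x y =
  match x / y with
  | q -> Printf.sprintf "quotient is %d" q
  | exception Division_by_zero -> "division by zero"
```

With an ordinary try ... with, a Division_by_zero raised while formatting the quotient would also be caught; here it would not be.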
You have a primitive notion of "it's safe to run from a certain point in your program under some precondition P"; let's write that safe(L, P): safe to run from L under P. Then we can recover the triples by doing a continuation-passing kind of thing: the triple holds if, whenever it's safe to run from the exit of the code under the state invariant Q, it's safe to run from the entry under P. We used this to give a program logic for machine code, again thinking in terms of this continuation-like notion.

So I'm going to talk about two things today, the main topics of the talk. The first is compiling with continuations, which we've seen a little bit already, I think, with all of the CPS translations that Xavier has presented; but also typing with continuations, which dates from my time at Meta, so I'm going to show you something we've actually used it for at Meta. But first I'm going to give an overview of compiler intermediate representations. These are the languages a compiler uses internally to represent translations from source code all the way down to whatever you're targeting. For functional languages, one could just work with some kind of desugared abstract syntax, or maybe something more like the lambda calculus, but without any particular restrictions on how the lambda calculus is used. More popular in recent years is a reduced form of lambda calculus where you have names for everything; this is called ANF. A similar language based on monads might be called a monadic intermediate language. And then finally there is continuation-passing style, which used to be popular for compiling functional languages, less so now. For imperative languages, again you might use some kind of desugared abstract syntax, but it's more likely that you're going to use some kind of flow graph with local variables.
Perhaps what's most popular these days for compiling imperative languages is SSA, static single assignment; I think Delphine Demange gave a talk about that last week, along with variants of it such as gated SSA. But again, you can use CPS, some sort of CPS language or CPS transformation, to compile imperative languages. You can see that the first bullet in these examples is much closer to the source code, and the one at the bottom, CPS, is much further away; it looks quite different.

Let's start by thinking about functional languages, so ANF. This is "administrative normal form", a very clumsy term really, and there's a history for why it's called that, but essentially you're naming every intermediate computation. Here, in OCaml syntax, if you take the recursive map function and translate it into ANF, you're basically giving a name to the result of applying f to x, and a name to the result of calling map recursively; every intermediate computation gets a name. A monadic intermediate language is similar, except that lets can be nested, so you can have "let y = let ...". Monads also provide a place to hang effect information, so they're popular when you've got some kind of effect analysis over the program. But again, the order of evaluation is made explicit; that's the point: you're naming every intermediate computation, so you can see in what order things are evaluated. Now, continuation-passing style also gives you this, but in a slightly more complicated way: every function takes an additional parameter, the continuation, to which the result gets passed. In OCaml syntax again, if you CPS-convert the map function, you're making the order of evaluation explicit by passing a continuation for every intermediate computation. The transformation is easy to write down, but it kind of explodes the code.
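To make the ANF and CPS versions of map concrete, here is a sketch in OCaml; this is my own reconstruction of the kind of code on the slides, with hypothetical names:

```ocaml
(* Direct style *)
let rec map f xs =
  match xs with
  | [] -> []
  | x :: rest -> f x :: map f rest

(* ANF: every intermediate computation is named by a let *)
let rec map_anf f xs =
  match xs with
  | [] -> []
  | x :: rest ->
      let y = f x in
      let ys = map_anf f rest in
      y :: ys

(* CPS: every function, including f itself, takes an extra
   continuation parameter to which its result is passed *)
let rec map_cps f xs k =
  match xs with
  | [] -> k []
  | x :: rest ->
      f x (fun y ->
        map_cps f rest (fun ys ->
          k (y :: ys)))
```

For example, `map_cps (fun x k -> k (x * 2)) [1; 2; 3] (fun ys -> ys)` evaluates to `[2; 4; 6]`; note how many lambdas the CPS version introduces compared with the ANF one.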
There are a lot of new functions, a lot of lambdas in there that weren't there before, and it kind of messes with your head to think about. But compilers don't care; it's just difficult to read as a programmer. So why compile using CPS rather than one of these other intermediate forms, which seem simpler? ANF is pretty readable and comprehensible. I think, historically, the first reason is languages that support first-class control operators like call/cc; examples would be Scheme, Typed Racket and Standard ML of New Jersey. There's a very straightforward CPS implementation, or translation, of this first-class control. When I say "a CPS language", that's slightly misleading: it's really the lambda calculus, and you're doing a transformation from the lambda calculus into the lambda calculus, but with these additional continuation lambdas. The second reason, advocated in Andrew Appel's book Compiling with Continuations some years back, is that every function call becomes a tail call, so we don't actually need anything like a call stack, because every function just jumps to the next one. All functions become closures, which we allocate on the ordinary heap, so instead of stack frames we have environments on the heap. You might be concerned that this comes with a performance cost, because the stack is generally implemented very efficiently on a microprocessor, and having to allocate on the heap may not be so efficient; there was a long debate some years ago about whether this was really the case. But I'm actually going to use continuations for a slightly different purpose: CPS translation makes for a very uniform, simple intermediate language in which we give names to every intermediate value but also to every control point in the program. So some static analyses, and this was my
experience at least with the work that Xavier mentioned, which I did at Microsoft, become much more straightforward; and some optimizations, you'll see one later, fall out really easily once you've done this translation.

There are also a few problems with the other approaches, with ANF, that the CPS language I'll describe doesn't suffer from. This is well known in the literature, but you have to get around it if you're using ANF for your intermediate language. One is that ANF is not closed under inlining. Here's an example: you've got a lambda that's immediately applied to some value, and obviously you'd want your compiler to inline the function. So you beta-reduce, or inline, and you end up with something which, strictly speaking, is no longer in administrative normal form, because ANF doesn't allow nested lets. You then have to do a step of renormalizing, where you move the lets around to bring the term back into strict ANF form. That seems straightforward, but it gets worse if you have conditionals. Suppose you had a conditional in the body of the lambda: now we've got a conditional bound by a let, which again is not in ANF. How do we renormalize that? There are two ways. One is by actually duplicating M (I'm going to say the word "continuation", because that's kind of what it is), so we end up with two copies of M in the term. That's not good: we get code bloat. Or we introduce a lambda to represent what's sometimes called a join point: we abstract out this M and call it in both branches of the conditional. I've called it K, suggesting a continuation, because that's kind of what a join point is. So I'm going to come back to the CPS language now.
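The duplication-versus-join-point choice just described can be seen in a tiny OCaml sketch of my own (illustrative names; the join point is modelled as an ordinary function that both branches tail-call):

```ocaml
(* After inlining (fun x -> if x then 1 else 2) into "let y = ... in m y",
   renormalizing the ANF either duplicates the let body m in both
   branches, or abstracts m as a join point k called from both. *)

(* Option 1: the body m (here, multiply by 10) is copied into each branch *)
let duplicated b = if b then 1 * 10 else 2 * 10

(* Option 2: m is abstracted as a join point k, shared by both branches *)
let with_join b =
  let k y = y * 10 in   (* join point: the shared continuation *)
  if b then k 1 else k 2
```

Both definitions compute the same thing; the second avoids copying m, at the cost of introducing the extra lambda k.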
The idea here is that I'm not going to use continuations to represent first-class control like call/cc or delimited continuations or any of those fancy features; I'm just going to use them to compile an ordinary language without such features. But I'm going to name all continuations, and they're going to be distinguished from ordinary functions: they're not general lambdas. In this suggestive OCaml-like syntax I've written "let cont": the CPS translation introduces these continuations that have been given names. It ends up looking rather contorted and weirdly nested, but for a compiler that's fine. So we've got a local continuation definition and a continuation application, which are syntactically different from ordinary lambda definition and ordinary function application; and ordinary function application takes an additional argument, which is the continuation.

This was work I did a few years ago; I had this paper, "Compiling with Continuations, Continued", and I'm pleased to see that it has made its way into practice now. There's work on the OCaml compiler, some of it carried out by Jane Street and OCamlPro, on an intermediate language called Flambda 2 that essentially looks like the language I proposed; there was a presentation about it at the OCaml workshop last year.

So here's a very small version of that intermediate language. We have values, which are pairs, say, data constructors, and lambdas, and you can see that the lambdas take an additional parameter which is the return continuation. All values are named. There's the local continuation definition that you saw in the OCaml code on the previous slide, there's an explicit return continuation for every function, and then, as I said, separate constructs for continuation application and for function application with a return continuation.
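Such an IR could be written down as an OCaml datatype along these lines. This is a hypothetical rendering of the grammar just described, not Flambda 2's actual definition:

```ocaml
(* A small CPS intermediate language: values are named, functions take
   an explicit return continuation, and continuations form a separate
   syntactic class from functions. Constructor names are illustrative. *)
type var = string
type cvar = string  (* continuation variables *)

type value =
  | Pair of var * var            (* (x, y) *)
  | Con of string * var          (* data-constructor application *)
  | Lam of cvar * var * term     (* fun k x -> body: k is the return continuation *)

and term =
  | LetVal of var * value * term          (* let x = v in t *)
  | LetCont of cvar * var * term * term   (* let cont k x = t1 in t2 *)
  | AppCont of cvar * var                 (* k x : apply a continuation (a jump) *)
  | App of var * cvar * var               (* f k x : call f with return continuation k *)
```

Note that continuations can only be defined by LetCont and applied by AppCont or passed in App; there is no way to return one or store one in a value, which is exactly the second-class restriction discussed below.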
Now, there's a design choice here: in the paper, even the branches of a match construct or a conditional are named. That's a design choice, and it's not so important for what comes next. Just to summarize, the features are: all intermediate values are named, and all control points are named. A couple of consequences follow. One is that we only ever substitute variables for variables; we never substitute actual values in. And the join points, the thing I introduced to avoid duplication when floating a conditional through a let, are present from the start: when you do a CPS translation into this language, the join points are already there. But continuations are second class, in the sense that they can be passed to functions but they can't be returned, they can't be stored in data structures, and they can't be accessed from outer function scopes. A consequence is that they just represent blocks of code: a continuation definition is just a block, and a continuation application is just a jump. There's an open question which I've never really resolved, and I don't know whether other people have since I did the work, which is how to nest the continuation definitions. There's a similar language used in the MLton compiler, where they have things that look like continuations, but they're completely closed: they just take a bunch of parameters, there are no free variables, and they're all defined at the top. In the language I described there's a choice: you could push them all the way down and minimize the number of parameters they take, and I'm never sure what works best; that's something I've not really resolved. But the language is closed under inlining. This slide isn't quite the language I
described, as it hasn't got named continuations, but you can see that basically I'm just doing beta reduction for the argument and for the continuation; this represents the code we saw earlier. And here's the example with the conditional from before, CPS-translated into my language. The point is that when we reduce, we've effectively got this lambda bound to f, and we're applying it to j and c here, and we can inline the whole definition; but we've already got the join point from the original CPS translation, so there's no need to introduce anything to avoid duplicating terms. It's just there. So the whole form is closed under inlining.

I'm going to talk a little bit about how you would actually implement this in a real compiler, because I think that's interesting as well. You could implement it in the usual functional style, where you have an algebraic data type to represent the expressions, and you do transformations essentially by making a new copy with deltas. That's generally expensive, because you're doing a lot of copying. An alternative is to represent the CPS language as a graph and do update in place, and this can be extremely efficient. This is an adaptation of ideas from some years ago by Andrew Appel and Trevor Jim. You can make substitution constant time, since you're only substituting variables for variables; and if you want to do a whole bunch of rewrites that do simple things to terms, that can take time linear in the size of the term, which is much harder to achieve if you're copying things. It does get complicated, and it's quite hard to implement and get the invariants right, but it produces very efficient compilation. There are three ingredients.
You have a kind of tree for the basic structure of the terms, which is doubly linked, so you can go back up as well as down. Then, for the variables, you link the bound variable of a lambda with all of its free occurrences in a circular list. Finally, in order to get from an occurrence back to the bound variable without having to navigate the circular list, you use a union-find data structure. It looks something like this: if you look at p, there's a link from the bound variable p down to its uses, and a circular list joining all the uses; and layered on top of that is a union-find data structure, which means you can get from the occurrences to the bound variable very efficiently. All of this means that you can do operations such as substitution in essentially constant time.

This is the kind of thing you'd put at the start of a talk, but here is what the compiler pipeline looks like, and I want to point out that we're doing something slightly strange. We're taking a direct-style language without first-class control operators, like OCaml; we're CPS-transforming it into this continuation-passing-style IR; we're doing a bunch of optimizing transformations; and then typically we do a kind of inverse CPS transform to get to the target, which might be assembler, say. In the case of the work I was doing, the target was the intermediate language of the .NET runtime, which is direct style; it doesn't have continuations built in. So you're going to a lot of effort without getting the benefits traditionally ascribed to CPS, like being able to implement call/cc. But the point is that this IR is a good place to do those optimizing transformations.
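The union-find ingredient can be sketched on its own. This is a minimal illustrative version of my own, with path compression only (union-by-rank omitted for brevity), not the actual compiler data structure:

```ocaml
(* Minimal union-find of the kind used to map each occurrence of a
   variable to its binder in near-constant time: occurrences are
   unioned into their binder's equivalence class, and find follows
   parent pointers with path compression. *)
type node = { mutable parent : int }

(* make n: n nodes, each initially a root (parent = -1 marks a root) *)
let make n = Array.init n (fun _ -> { parent = -1 })

let rec find uf i =
  if uf.(i).parent < 0 then i
  else begin
    let r = find uf uf.(i).parent in
    uf.(i).parent <- r;   (* path compression *)
    r
  end

(* union binder occ: put the occurrence into the binder's class *)
let union uf binder occ =
  let r = find uf binder in
  let o = find uf occ in
  if r <> o then uf.(o).parent <- r
```

In the real structure this sits on top of the doubly-linked term tree and the circular list of uses; here the nodes are just array indices.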
I'm going to show you one of those transformations, called contification, which is really quite easy to do when you've got this CPS language, but much harder without it. Here's an example piece of OCaml code. If you look at it, you can see that the right-hand sides of these branches always return to the same place: the result of the match is passed to g, and both branches are calling f. So you can turn f into something which doesn't need to be a function definition at all; it just needs to be a continuation, a basic block, and can be compiled much more efficiently. There was work on this by Matthew Fluet and Stephen Weeks some years ago, where they used a kind of dominator analysis to determine this; but it shows up straight away when you CPS-transform into the language I described. Let's take this program and CPS-transform it. What you notice is that every use of the function f gets passed the same continuation, k'. That means you can inline that continuation; it's just like a common argument. But first we need to hoist it so that it's in scope at the function definition, because k' is defined below; once we hoist it up, it's in scope, and we can replace the function with a continuation, substituting the continuation in at each function call. So instead of a function call we've now got a continuation call, which is just a jump: we've turned something more expensive to implement, a function call, into something very cheap, a jump. This generalizes to mutually recursive functions, but it's really just a kind of common-argument elimination, and you can iterate this optimization over a whole program.
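The effect of contification can be illustrated directly in OCaml. This is a toy reconstruction of my own: the contified f becomes a local tail-called function k, standing in for a continuation, that is, a basic block with a parameter:

```ocaml
(* Before: f is an ordinary function; both branches call f and
   return its result to the same place, the call to g *)
let before g f b = g (if b then f 1 else f 2)

(* After contification: f's role is played by a local join point k
   that jumps straight on to g. k is only ever tail-called, so a
   compiler can emit it as a basic block rather than a closure. *)
let after g f b =
  let k x = g (f x) in
  if b then k 1 else k 2
```

Both versions compute the same results; the difference only matters to the compiler, which can implement k as a jump instead of a call-and-return.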
That gives you what Fluet and Weeks called optimal contification, but here it's a completely local transformation, whereas they had to do a dominator analysis. And for real programs, a surprising number of functions can be turned into continuations.

Now I'm going to say a little bit about imperative languages, having looked at functional ones, because I mentioned that CPS can be used for those too. The most common intermediate form people use to compile imperative languages like C, say, is SSA. The point of SSA is that every variable is defined before it's used and assigned exactly once, so it's a bit like a let; but when you've got join points in your program, there are these phi functions, which, well, I find very hard to explain, and I think many people do. They're not really functions: this join point here says that i1 gets i2 if control came down this path, and gets i3 if it came down that path, so it needs to know what the control flow was. There's something odd, sort of upside down, about them. It is possible to give them semantics (this is the work of Delphine Demange and others). Another problem is that, as with ANF for functional languages, when you do function inlining you have to recompute the SSA form; it's not closed under function inlining. A few years back, Andrew Appel observed that "SSA is functional programming": SSA, written slightly differently, is just defining functions, in the language I described, continuations, that get passed different arguments depending on the control flow. You've inverted things: instead of a phi node, you've got the choice of argument at the point where you jump to the continuation. These are like blocks with parameters, if you like, and this fits perfectly into the CPS language that I described. Function inlining doesn't destroy well-formedness.
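The "blocks with parameters" reading of SSA can be sketched in OCaml (my own illustration): a loop-header phi node for the counter and the accumulator becomes the parameter list of a recursive local continuation, and each jump to the header chooses the arguments.

```ocaml
(* "SSA is functional programming": a phi node at a loop header
   corresponds to the parameters of a recursive local function, and
   the phi's operands become the arguments passed at each jump site. *)
let sum_to n =
  (* loop header: parameters i and acc play the role of phi nodes *)
  let rec loop i acc =
    if i > n then acc            (* loop exit *)
    else loop (i + 1) (acc + i)  (* back edge: arguments select the new values *)
  in
  loop 1 0                       (* entry edge: initial phi operands *)
```

There is no phi "function" anywhere; the choice between the entry value and the back-edge value is made by whichever call to loop is executed.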
There's this invariant of SSA, the dominance invariant, and here that's just scoping. You can also express loop structure using recursive continuations. What I don't know is whether anyone has taken this seriously: SSA still seems to be very popular, along with variants of it like gated SSA, and I don't know of anyone using this kind of CPS-based language for compiling imperative languages.

OK, so that's the first half of my talk. I'm now going to talk about something I've done more recently at Meta, which you could characterize as typing with continuations. I have to introduce a few things, because this is stuff that hasn't really made it outside of Meta very much. We have this language called Hack. If you go to the Facebook website, or access Facebook on your phone, there will be some Hack code running on a server somewhere. It's the most used language at Meta, even though it's hardly used outside; it is open source, but it's not had much take-up. It's based on PHP, which is a terrible language; that's why they created Hack, to improve on PHP. It runs on HHVM, a JIT-compiled runtime based on bytecode, a bit like the JVM, and programs are checked by Hack's whole-program type checker, which is implemented in OCaml, with some of it now implemented in Rust. Millions of lines of PHP have been migrated to Hack, a whole bunch of bad features of PHP have been taken out, and good features have been added; static typing and async/await-style concurrency are the major ones, I think, and it's continually being developed. What I'm going to talk about now is how type inference works for local variables. There's a rich type system in Hack that borrows ideas from various other languages: there's subtyping based on classes and interfaces and traits, there's a notion of nullable types, and there are
generics, parametric polymorphism, as you'd find in Java or C#. Something you wouldn't find there is structural subtyping: function types, and record types, which we call shapes. So it's quite a rich collection of types. Hack has local variables, but they're kind of weird compared to, say, OCaml, because there's no scoping and no declaration: you don't declare the type, so there's nowhere even to hang the type of a local variable. It just springs into existence when you first assign to it, so you can think of it as scoped across the whole function, I guess. Its runtime type typically changes during execution; there are types available at runtime, and these can be tested dynamically, though that's not really relevant to what I'm going to describe. Because locals don't have declared types (in fact there's nowhere to put the type), we have to infer them. So the Hack type checker infers types for locals, and this inference is flow sensitive; we'll see this in a minute with some examples. At join points in the program we find an upper bound of the types; we actually have union types in the language. Types can also be refined by dynamic testing. Here are three programs that illustrate this flow sensitivity. Let's have a look at the first one. It takes in a bool (we do declare types on the parameters at least, so the type checker doesn't have to infer those). In this conditional block we assign a string to x, then call some other function, and then assign an int; but in the other branch we've just assigned a string. So what's the type of x at this point? Hack will infer a union type, int or string. The type mixed is the top type, like java.lang.Object in Java, and this union type is a subtype of mixed, so this program is type correct. Now let's look at
this function, which involves a loop. It takes in an integer, assigns a boolean to s, and then enters a loop; there's possibly a break out of the loop, and inside the loop, after the break, it assigns a string to s. So what happens here? If we analyze this, we can see that if the break is taken, then the assignment of the string never happens, so s could still be a boolean. This is actually type-incorrect, and Hack will flag an error to say you're trying to return something which might be a boolean. Maybe slightly stranger still is this example involving exceptions. Here we assign a new object of class Foo to a variable f, then enter a try block and call some function; if an exception is caught, we assign a differently typed value to f. So at this point, when we try to call a method on it, Hack has to know that this could be a Foo or it could be a Bar; it had to analyze the control flow to determine that.

So how do we formalize this idea of flow sensitivity for types? You won't be surprised that I'm going to use continuations. I think in an earlier lecture Xavier talked about imperative programs and the fact that at any program point there may be many continuations, and that's the idea we use here. There's usually a next-statement continuation (not always, but usually), which we call next. If you're inside a loop there might be a break continuation, and a continue continuation to return to the top of the loop; in a try block there's a catch continuation; and we also have try ... finally. So there are a number of continuations that may be in play at any point in the program.

Here's a small subset of the Hack language. We've got some types, bool, int and mixed; we've got some expressions; and then some statements: assignment, the empty statement, sequencing, conditionals, and
break, continue, and loops. We assume some kind of subtyping relation between types. For typing expressions you do the usual thing: you define a context that gives the types of the locals with respect to which you're going to type the expression (an example would be "x has type int and y has type bool or string"), and then you define a judgment that tells you what type an expression has, given some context of local types. Fortunately, you're not allowed to assign inside an expression, so apart from being able to throw exceptions, expressions are mostly pure for the purposes of local typing. For statements, what we do is define another context which lists all the possible continuations, and each continuation itself carries a set of local types: this is what has to hold of the locals for it to be safe to jump to that continuation. Here's an example: the next continuation assumes that x has type int, but the break continuation assumes that x has type string and y has type bool. Then, when we define typing for statements, it's basically a safety statement: we say that s is safe to run if the locals have the types given by Γ and the continuations have the types given by Δ. Then we can define rules, say for sequencing. The empty block is safe given a typing for the locals and the same typing for the next continuation, since the locals haven't changed; whereas if you're sequencing, then the next continuation of the first statement had better be compatible with what comes after. So it has the flavor of a program logic, I suppose. You can even detect unreachable code using this: if there's no next continuation in the context, then we can do anything after the semicolon, because we know that s doesn't
continue to the next continuation. For example, it might have thrown an exception or gone into an infinite loop; we can actually read that off from the typing. So here’s a rule for assignment: the assignment is safe to run if the next continuation has updated the local typing so that the assigned variable has the type of e. Conditionals are pretty straightforward, because you’ve basically got to have the same typing for both statements, and the same continuations. There are weakening rules as well, which I’m not showing you here, so you can always weaken the assumptions. And rules for loops can also be expressed in this style.

So how do we implement inference for this system? Well, we take a statement and, given a typing for the locals, we infer what the types of the continuations would have to be to make that statement safe. This is essentially the weakest context of continuation types such that the safety judgment holds, and you can compare this to notions in program logic. So here’s an example for conditionals: you run inference on the then branch and on the else branch, and that gives you back two continuation environments representing what is safe for those statements to run; the result of the whole thing is then some kind of join of those environments, and this is the point at which we compute, say, unions for the types of the locals. There’s a similar process for all the features of the language.

So in practice, Hack type inference actually uses three main techniques. There’s this continuation-based approach to deal with locals; but, as you saw, it’s got quite a rich type system, with generics and subtyping, and it basically does some kind of constraint solving for that
and it also does a little bit of bidirectional type checking for lambdas. And it works: we’ve got tens of millions of lines of Hack code, and people will make a change in VS Code, you know, the IDE, and expect the type checker to come back quickly. And it does, because we do a sort of incremental checking; it’s run in parallel, and now actually it’s run distributed, and it’s efficient, but it’s built on these inference techniques.

Okay, so that’s pretty much it. I’m going to leave you with some references. What introduced ANF was this paper, “The Essence of Compiling with Continuations”, and then there’s the work that I did a few years ago, “Compiling with Continuations, Continued”. I don’t want you to get the impression that people are in agreement that this is the right way to do compilation, because there are alternative views. Here’s one, “Compiling without Continuations”, which basically takes what we did and says, well, maybe not. There’s the exceptional syntax paper that I mentioned earlier; “SSA is Functional Programming”, which observes that phi nodes can be represented by local blocks; and this notion of quantification. Okay, thank you very much.
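[Editorial note: the continuation-based typing and inference described in the talk can be sketched concretely. The following is a minimal, runnable illustration in Python; the toy statement language, all names, and the use of "mixed" as a stand-in for real union types are assumptions made for the example, not Hack's actual implementation.]

```python
# Sketch of continuation-based typing for statements (illustrative only,
# not Hack's implementation).  Gamma maps locals to types; Delta maps
# each continuation ("next", "break", ...) to the locals typing that
# holds on arrival.  "mixed" is a crude stand-in for a union type.

def join_ty(a, b):
    return a if a == b else "mixed"

def join_locals(g1, g2):
    return {x: join_ty(g1[x], g2[x]) for x in g1 if x in g2}

def join_delta(d1, d2):
    out = dict(d1)
    for k, g in d2.items():
        out[k] = join_locals(out[k], g) if k in out else g
    return out

def infer(gamma, stmt):
    """Given the locals typing before `stmt`, compute the continuation
    environment it produces.  Statements are tuples: ('skip',),
    ('assign', x, ty), ('seq', s1, s2), ('if', s1, s2), ('break',)."""
    kind = stmt[0]
    if kind == "skip":
        return {"next": dict(gamma)}            # locals unchanged
    if kind == "assign":                        # x := e, where e : ty
        _, x, ty = stmt
        return {"next": {**gamma, x: ty}}
    if kind == "break":
        return {"break": dict(gamma)}           # no "next" entry at all
    if kind == "seq":
        d1 = infer(gamma, stmt[1])
        if "next" not in d1:                    # stmt[2] is unreachable,
            return d1                           # so anything after is fine
        g1 = d1.pop("next")
        return join_delta(d1, infer(g1, stmt[2]))
    if kind == "if":                            # join the two branch
        return join_delta(infer(gamma, stmt[1]),    # environments
                          infer(gamma, stmt[2]))
    raise ValueError(kind)

# The example from the talk: one branch breaks with x : string,
# the other falls through with x : int.
s = ("if",
     ("seq", ("assign", "x", "string"), ("break",)),
     ("assign", "x", "int"))
d = infer({"x": "int", "y": "bool"}, s)
assert d["next"]["x"] == "int"      # falling through: x is an int
assert d["break"]["x"] == "string"  # at break: x is a string
```

The absence of a "next" entry is how unreachable code shows up in this sketch, and the "if" case is the point where the join computes unions over the locals, as described in the talk.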
