Compare commits: viewRender...master (46 commits)
BUILD (5 lines, deleted) @@ -1,5 +0,0 @@

RELEASE=RELEASE_381 # this may have to be changed based on llvm version
svn co https://llvm.org/svn/llvm-project/llvm/tags/$RELEASE/final $GOPATH/src/llvm.org/llvm
cd $GOPATH/src/llvm.org/llvm/bindings/go
./build.sh
go install llvm.org/llvm/bindings/go/llvm
NOTES (431 lines, deleted) @@ -1,431 +0,0 @@

Been thinking about the stack and heap a lot. It would be possible, though
possibly painful, to enforce a language with no global heap. The question really
is: what are the principles which give reason to do so? What are the principles
of this language, period? The principles are different than the use-cases. They
don't need to be logically rigorous (at first anyway).

##########

I need to prioritize the future of this project a bit more. I've been thinking
I'm going to figure this thing out at this level, but I shouldn't even be
working here without a higher-level view.

I can't finish this project without financial help. I don't think I can get a v0
up without financial help. What this means, at minimum, no matter what, is that
I'm going to have to:

- Develop a full concept of the language that can get it to where I want it to go
- Figure out where I want it to go
- Write the concept into a manifesto of the language
- Write the concept into a proposal for a course of action to take in developing
  the language further

I'm unsure about what this language actually is, or is actually going to look
like, but I'm sure of those things. So those are the lowest-hanging fruit, and I
should start working on them pronto. It's likely I'll need to experiment with
some ideas which will require coding, maybe even some big ideas, but those
should all be done under the auspices of developing the concepts of the
language, and not the compiler of the language itself.

#########
Elemental types:

* Tuples
* Arrays
* Integers

#########
Been doing thinking and research on ginger's elemental types and what their
properties should be. Ran into a roadblock where I was asking myself these
questions:

* Can I do this without atoms?
* What are the different ways atoms can be encoded?
* Can I define language types (elementals) without defining an encoding for
  them?

I also came up with two new possible types:

* Stream: effectively an interface which produces discrete packets (each has a
  length), where the production of one packet indicates the size of the next
  one at the same time.
* Tagged: sort of like a stream; effectively a type which says "we don't know
  what this will be at compile-time, but we know it will be prefixed with some
  kind of tag indicating its type and size."
  * Maybe only the size is important.
  * Maybe this precludes user-defined types that aren't composites of the
    elementals? Maybe that's ok?
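The Stream idea above can be sketched as a Go interface, since the vm work is in Go. Everything here (`Packet`, `Stream`, `byteStream`) is a hypothetical illustration, not anything from the actual codebase:

```go
package main

import "fmt"

// Packet carries its own data plus the announced size of the next packet,
// matching the "Stream" elemental described above. 0 means no more packets.
type Packet struct {
	Data     []byte
	NextSize int
}

type Stream interface {
	Next() (Packet, error)
}

// byteStream chops a buffer into fixed-size packets, announcing each
// following packet's size as it goes.
type byteStream struct {
	buf  []byte
	size int
}

func (s *byteStream) Next() (Packet, error) {
	n := s.size
	if n > len(s.buf) {
		n = len(s.buf)
	}
	p := Packet{Data: s.buf[:n]}
	s.buf = s.buf[n:]
	if rem := len(s.buf); rem < s.size {
		p.NextSize = rem
	} else {
		p.NextSize = s.size
	}
	return p, nil
}

func main() {
	s := &byteStream{buf: []byte("hello world"), size: 4}
	for {
		p, _ := s.Next()
		fmt.Println(string(p.Data), p.NextSize)
		if p.NextSize == 0 {
			break
		}
	}
}
```

The point of the sketch is only that a consumer always knows how big the next read will be before making it, which is what distinguishes a Stream from a plain byte sequence.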
Ran into this:
https://www.ps.uni-saarland.de/~duchier/python/continuations.html
https://en.wikipedia.org/wiki/Continuation#First-class_continuations

which is interesting. A lot of my problems now derive from stack-based systems
and their need to know the size of input and output data; continuations seem to
be an alternative system?

I found this:

http://lambda-the-ultimate.org/node/4512

I don't understand any of it. I should definitely learn feather.

I should finish reading this:
http://www.blackhat.com/presentations/bh-usa-07/Ferguson/Whitepaper/bh-usa-07-ferguson-WP.pdf

#########
Ok, so I'm back at this for the first time in a while, and I've got a good thing
going. The vm package is working out well. Using tuples and atoms as the basis
of a language is pretty effective (thanks erlang!). I've got basic variable
assignment working as well. No functions yet. Here's the things I still need to
figure out or implement:

* lang
  * constant size arrays
    * using them for a "do" macro
  * figure out constant, string, int, etc... look at what erlang's actual
    primitive types are for a hint
  * figure out all needed macros for creating and working with lang types
* vm
  * figure out the differentiation between compiler macros and runtime calls
    * probably separate the two into two separate call systems
  * the current use of varCtx is still pretty ugly; the do macro might help
    clean it up
  * functions
    * are they a primitive? I guess so....
    * declaration and type
    * variable deconstruction
    * scoping/closures
  * compiler macros, need vm's Run to output a lang.Term
  * need to learn about linking
    * figure out how to include the llvm library in the compiled binary and
      make it callable. runtime macros will come from this
    * linking in of other ginger code? or how to import in general
* compiler, a general-purpose binary for taking ginger code and turning it
  into machine code using the vm package
  * swappable syntax, including syntax-dependent macros
* close the loop?

############
I really want contexts to work. They _feel_ right, as far as abstractions go.
And they're clean, if I can work out the details.

Just had a stupid idea, might as well write it down though.

Similar to how the DNA and RNA in our cells work, each Context is created with
some starting set of data on it. This will be the initial protein block. Based
on the data there, some set of Statements (the RNA) will "latch" on and do
whatever work they're programmed to do. That work could include making new
Contexts and "releasing" them into the ether, where they would get latched onto
(or not).

There's so many problems with this idea, it's not even a little viable. But here
goes:

* Order of execution becomes super duper fuzzy. It would be really difficult to
  think about how your program is actually going to work.

* Having Statement sets just latch onto Contexts is super janky. They would get
  registered, I guess, and it would be pretty straightforward to differentiate
  one Context from another, but what about conflicts? If two Statements want to
  latch onto the same Context, then what? If we wanted to keep the metaphor,
  one would just get randomly chosen over the other, but obviously that's
  insane.

############
I explained some of this to ibrahim already, but I might as well get it all
down, cause I've expanded on it a bit since.

Basically, ops (functions) are fucking everything up. The biggest reason for
this is that they are really, really hard to implement without a type annotation
system. The previous big braindump is about that, but basically I can't figure
out a way that feels clean and good enough to be called a "solution" to type
inference. I really don't want to have to add type annotations just to support
functions, at least not until I explore all of my options.

The only other option I've come up with so far is the context thing. It's nice
because it covers a lot of ground without adding a lot of complexity. Really the
biggest problem with it is that it doesn't allow for creating new things which
look like operations. Instead, everything is done with the %do operator, which
feels janky.

One solution I just thought of is to get rid of the %do operator and simply make
it so that a list of Statements can be used as the operator in another
Statement. This would _probably_ allow for everything that I want to do. One
outstanding problem I'm facing is figuring out whether all Statements should
take a Context or not.

* If they did, it would be a lot more explicit what's going on. There wouldn't
  be an ethereal "this context" that would need to be managed and thought
  about. It would also make things like using a set of Statements as an
  operator a lot more straightforward, since without Contexts in the Statement
  it'll be weird to "do" a set of Statements in another Context.

* On the other hand, it's quite a bit more boilerplate. For the most part, most
  Statements are going to want to be run in "this" context. Also, this wouldn't
  really decrease the number of necessary macros, since one would still be
  needed in order to retrieve the "root" Context.

* One option would be for a Statement's Context to be optional. I don't really
  like this option; it makes a very fundamental datatype (a Statement) a bit
  fuzzier.

* Another thing to think about is that I might just rethink how %bind works so
  that it doesn't operate on an ethereal "this" Context. %ctxbind is one
  attempt at this, but there's probably other ways.

* One issue I just thought of with having a set of Statements be used as an
  operator is that the argument to that Statement becomes.... weird. What even
  is it? Something the set of Statements can access somehow? Then we still need
  something like the %in operator.
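For what it's worth, the "every Statement carries an explicit Context" option can be sketched in Go. All of these names are hypothetical stand-ins, not the vm package's real types:

```go
package main

import "fmt"

// Context holds named bindings. A hypothetical stand-in for the real thing.
type Context struct {
	bindings map[string]int
}

// Statement pairs an operator with its argument and, in the "explicit" design
// discussed above, the Context it should run in; no ethereal "this context".
type Statement struct {
	Ctx *Context
	Op  func(ctx *Context, arg int) int
	Arg int
}

func (s Statement) Run() int { return s.Op(s.Ctx, s.Arg) }

func main() {
	ctx := &Context{bindings: map[string]int{}}
	// A %bind-like operator: binds its argument to "A" in the given Context.
	bindA := func(c *Context, v int) int { c.bindings["A"] = v; return v }
	Statement{Ctx: ctx, Op: bindA, Arg: 42}.Run()
	fmt.Println(ctx.bindings["A"])
}
```

The boilerplate cost mentioned above is visible even here: every Statement construction has to name its Context explicitly.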
Let me backtrack a bit. What's the actual problem? The actual thing I'm
struggling with is allowing for code re-use, specifically pure functions. I
don't think there's any way anyone could argue that pure functions are not an
effective building block in all of programming, so I think I can make that my
statement of faith: pure functions are good and worthwhile; impure functions
are.... fine.

Implementing them, however, is quite difficult. More so than I thought it would
be. The big inhibitor is the method by which I actually pass input data into the
function's body. From an implementation standpoint it's difficult because I
*need* to know how many bytes on the stack the arguments take up. From a syntax
standpoint this is difficult without a type annotation system. And from a
usability standpoint this is difficult because it's a task the programmer has to
do which doesn't really have to do with the actual purpose or content of the
function; it's just a book-keeping exercise.

So the stack is what's screwing us over here. It's a nice idea, but it
ultimately makes what we're trying to do difficult. I'm not sure there's ever
going to be a method of implementing pure functions that doesn't involve
argument/return value copying, though, and therefore which doesn't involve
knowing the byte size of your arguments ahead of time.

It's probably not worth backtracking this much either. For starters, CPUs are
heavily optimized for stack-based operations, and much of the way we currently
think about programming is also based on the stack. It would take a lot of
backtracking if we ever moved to something else, if there even is anything else
worth moving to.

If that's the case, how is the stack actually used then?
* There's a stack pointer which points at an address on the stack, the stack
  being a contiguous range of memory addresses. The place the stack pointer
  points to is the "top" of the stack; all higher addresses are considered
  unused (no matter what's in them). All the values in the stack are available
  to the currently executing code; it simply needs to know either their
  absolute address or their position relative to the stack pointer.

* When a function is "called", the arguments to it are copied onto the top of
  the stack, the stack pointer is increased to reflect the new stack height,
  and the function's body is jumped to. Inside the body the function need only
  pop values off the stack as it expects them; as long as it was called
  properly, it doesn't matter how or when the function was called. Once it's
  done operating, the function ensures all the input values have been popped
  off the stack, subsequently pushes the return values onto the stack, and
  jumps back to the caller (the return address was also stored on the stack).
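That calling convention can be mimicked with a toy stack machine in Go. Nothing here is from the project; it's just the mechanism above made concrete:

```go
package main

import "fmt"

// A toy stack machine illustrating the convention described above: the caller
// pushes arguments, the callee pops them and pushes its return value.
type stack struct{ mem []int }

func (s *stack) push(v int) { s.mem = append(s.mem, v) }

func (s *stack) pop() int {
	v := s.mem[len(s.mem)-1]
	s.mem = s.mem[:len(s.mem)-1]
	return v
}

// add is the "function body": it pops its two arguments off the stack and
// pushes its return value back on. It must know how many values to pop.
func add(s *stack) {
	b, a := s.pop(), s.pop()
	s.push(a + b)
}

func main() {
	s := &stack{}
	// The "call": copy the arguments onto the top of the stack, then jump
	// to the body.
	s.push(2)
	s.push(3)
	add(s)
	// The return value is now on top of the stack.
	fmt.Println(s.pop())
}
```

Note that `add` has the byte-size problem baked in: it only works because it knows, ahead of time, exactly how many values its arguments occupy.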
That's not quite right, but it's close enough for most cases. The more I'm
reading about this, the more I think it's not going to be worth it to backtrack
past the stack. There's a lot of compiler- and machine-specific crap that gets
involved at that low of a level, and I don't think it's worth getting into it.
LLVM did all of that for me; I should learn how to make use of that to make what
I want happen.

But what do I actually want? That's the hard part. I guess I've come full
circle. I pretty much *need* to use llvm functions. But I can't do it without
declaring the types ahead of time. Ugghh.

################################
So here's the current problem:

I have the concept of a list of statements representing a code block. It's
possible/probable that more than this will be needed to represent a code block,
but we'll see.

There's two different ways I think it's logical to use a block:

* As a way of running statements within a new context which inherits all of its
  bindings from the parent. This would be used for things like if statements
  and loops, and behaves the way a code block behaves in most other languages.

* To define an operator body. An operator's body is effectively the same as the
  first use-case, except that it has input/output as well. An operator can be
  bound to an identifier and used in any statement.

So the hard part, really, is that second point. I have the first done already.
The second one isn't too hard to "fake" using our current context system, but it
can't be made to be used as an operator in a statement. Here's how to fake it
though:
* Define the list of statements
* Make a new context
* Bind the "input" bindings into the new context
* Run %do with that new context and list of statements
* Pull the "output" bindings out of that new context

And that's it. It's a bit complicated, but it ultimately works and effectively
inlines a function call.

It's important that this looks like a normal operator call though, because I
believe in guy steele. Here's the current problems I'm having:
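Those five "faking it" steps can be sketched in plain Go, with a map standing in for a Context and a function slice standing in for a list of Statements (hypothetical types, not the project's actual API):

```go
package main

import "fmt"

// Context is a hypothetical stand-in: just a flat map of bindings.
type Context map[string]int

// doBlock mimics %do: run each statement against the given context.
func doBlock(ctx Context, stmts []func(Context)) {
	for _, s := range stmts {
		s(ctx)
	}
}

func main() {
	// 1. Define the list of statements (here: out = in * 2).
	stmts := []func(Context){
		func(c Context) { c["out"] = c["in"] * 2 },
	}
	// 2. Make a new context.
	ctx := Context{}
	// 3. Bind the "input" bindings into the new context.
	ctx["in"] = 21
	// 4. Run %do with that new context and list of statements.
	doBlock(ctx, stmts)
	// 5. Pull the "output" bindings out of that new context.
	fmt.Println(ctx["out"])
}
```

Seen this way, the "fake" really is an inlined call: the caller does the argument copying and result retrieval by hand, which is exactly why it doesn't look like a normal operator call.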
* Defining the input/output values is the big one. In the inline method those
  were defined implicitly based on what the statements actually use, and the
  compiler would fail if any were missing or the wrong type. But here we
  ideally want to define an actual llvm function and not inline every time. So
  we need to somehow "know" what the input/output is, and their types.

  * The output value isn't actually *that* difficult. We just look at the
    output type of the last statement in the list and use that.

  * The input is where it gets tricky. One idea would be to use a statement
    with no input as the first statement in the list, and that would define
    the input type. The way macros work this could potentially "just work",
    but it's tricky.

    * It would also be kind of difficult to make work with operators that take
      in multiple parameters. For example, `bind A, 1` would be the normal
      syntax for binding, but if we want to bind an input value it gets
      weirder.

      * We could use a "future" kind of syntax, like `bind A, _` or something
        like that, but that would require a new expression type and also just
        be kind of weird.

      * We could have a single macro which always returns the input, like
        `%in` or something. So the bind would become `bind A, %in`, or
        `bind (A, B), %in` if we ever get destructuring. This isn't a terrible
        solution, though a bit unfortunate in that it could get confusing with
        different operators all effectively using the same input variable. It
        also might be a bit difficult to implement, since it kind of forces us
        to only have a single argument to the LLVM function? Hard to say how
        that would work. Possibly all llvm functions could be made to take in
        a struct, but that would be ghetto af. Not doing a struct would take a
        special interaction though.... It might not be possible to do this
        without a struct =/
* Somehow allowing to define the context which gets used on each call to the
  operator, instead of always using a blank one, would be nice.

  * The big part of this problem is actually the syntax for calling the
    operator. It's pretty easy to have this handled within the operator by the
    %thisctx macro. But we want the operator to be callable by the same syntax
    as all other operator calls, and currently that doesn't have any way of
    passing in a new context.

  * Additionally, if we're implementing the operator as an LLVM function then
    there's not really any way to pass in that context to it without making
    those variables global or something, which is shitty.

* So writing all this out, it really feels like I'm dealing with two separate
  types that just happen to look similar:

  * Block: a list of statements which run with a variable context.

  * Operator: a list of statements which run with a fixed (empty?) context,
    and have input/output.

  * There's so very nearly a symmetry there. Things that are inconsistent:

    * A block doesn't have input/output.

      * It sort of does, in the form of the context it's being run with and
        %ctxget, but not an explicit input/output like the operator has.

      * If this could be reconciled I think this whole shitshow could be made
        to have some consistency.

      * Using %in this pretty much "just works". But it's still weird. Really
        we'd want to turn the block into a one-off operator every time we use
        it. This is possible.

    * An operator's context must be empty.

      * It doesn't *have* to be; defining the ctx which goes with the operator
        could be part of however an operator is created.

* So after all of that, I think operators and blocks are kind of the same.

  * They both use %in to take in input, and both output using the last
    statement in their list of statements.

  * They both have a context bound to them; an operator's is fixed but a
    block's changes.

  * An operator is a block with a bound context.
##############@@@@@@@@@#$%^&^%$#@#$%^&*

* New problem: type inference. LLVM requires that a function's definition have
  the type specified up-front. This kind of blows. Well actually, it blows a
  lot more than kind of. There's two things that need to be inferred from a
  List of Statements then: the input type and the output type. There's two
  approaches I've thought of in the current setup.

  * There's two approaches to determining the type of an operator: analyze the
    code as ginger expressions, or build the actual llvm structures and
    analyze those.

    * Looking at the ginger expressions is definitely somewhat fuzzy. We can
      look at all the statements and sub-statements until we find an instance
      of %in, then look at what that's input into. But if it's simply binding
      into an Identifier then we have to find the identifier. If it's
      destructuring, then that gets even *more* complicated.

      * Destructuring is what really makes this approach difficult.
        Presumably there's going to be a function that takes in an Identifier
        (or %in I guess?) and a set of Statements and returns the type for
        that Identifier. If we find that %in is destructured into a tuple,
        then we would run that function for each constituent Identifier and
        put it all together. But then this inference function is really
        coupled to %bind, which kind of blows. Also we may one day want to
        support destructuring into non-tuples as well, which would make this
        even harder.

      * We could make it the job of the macro definition to know its input
        and output types, as well as the types of any bindings it makes.
        That places some burden on user macros in the future, but then maybe
        it can be inferred for user macros? That's a lot of hope. It would
        also mean the macro would need the full set of statements that will
        ever run in the same Context as it, so it can determine the types of
        any bindings it makes.

    * The second method is to build the statements into LLVM structures and
      then look at those structures. This has the benefit of being
      non-ambiguous once we actually find the answer. LLVM is super strongly
      typed, and re-iterates the types involved for every operation. So if
      the llvm builder builds it, then we need only look for the first usage
      of every argument/return and we'll know the types involved.

      * This requires us to use structs for tuples, and not actually use
        multiple arguments. Otherwise it won't be possible to know the
        difference between a 3-argument function and a 4-argument one which
        doesn't use its 4th argument (which shouldn't really happen, but
        could).

      * The main hindrance is that the llvm builder is really not designed
        for this sort of thing. We could conceivably create a "dummy"
        function with bogus types, write the body, analyze the body, erase
        the function, and start over with a non-dummy function. But it's the
        "analyze the body" step that's difficult. It's difficult to find the
        types of things without the llvm.Value objects in hand, but since
        building is set up as a recursive process, that becomes non-trivial.
        This really feels like the way to go though; I think it's actually
        doable.

        * This could be something we tack onto llvmVal, and then make Build
          return extra data about what types the Statements it handled input
          and output.

* For other setups that would enable this a bit better, the one that keeps
  coming to mind is a more pipeline-style system. Things like %bind would need
  to be refactored from something that takes a Tuple to something that only
  takes an Identifier and returns a macro which will bind to that Identifier.
  This doesn't *really* solve the type problem I guess, since whatever is
  input into the Identifier's bind doesn't necessarily have a type attached to
  it. Sooo yeah nvm.
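The "analyze the code as ginger expressions" approach could look roughly like this toy walk. The AST and the single inferred type are made up purely for illustration; the real problem (tracing uses of the bound Identifier, destructuring) is exactly the part this sketch dodges:

```go
package main

import "fmt"

// Toy expressions: the special %in marker, an int literal, or a bind of an
// identifier to an expression.
type expr interface{}

type in struct{}  // the %in macro
type lit struct{ v int } // an integer literal
type bind struct { // bind <id>, <expr>
	id string
	to expr
}

// inferInType walks the statements looking for what %in is bound to. In a real
// inferencer we'd then trace every later use of that Identifier to find its
// type; here we just pretend everything is an int, which is the fuzzy part.
func inferInType(stmts []expr) string {
	var inID string
	for _, s := range stmts {
		if b, ok := s.(bind); ok {
			if _, isIn := b.to.(in); isIn {
				inID = b.id // %in was bound to this identifier
			}
		}
	}
	if inID == "" {
		return "none"
	}
	return "int"
}

func main() {
	stmts := []expr{bind{"A", in{}}, bind{"B", lit{2}}}
	fmt.Println(inferInType(stmts))
}
```

Even in this stripped-down form, the coupling to %bind that the notes complain about is visible: the walk only works because it knows bind's shape.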
127
README.md
127
README.md
@ -1,118 +1,29 @@
|
|||||||
# Ginger - holy fuck again?
|
# Ginger
|
||||||
|
|
||||||
## The final result. A language which can do X
|
A programming language utilizing a graph datastructure for syntax. Currently in
|
||||||
|
super-early-alpha-don't-actually-use-this-for-anything development.
|
||||||
|
|
||||||
- Support my OS
|
## Development
|
||||||
- Compile on many architectures
|
|
||||||
- Be low level and fast (effectively c-level)
|
|
||||||
- Be well defined, using a simple syntax
|
|
||||||
- Extensible based on which section of the OS I'm working on
|
|
||||||
- Good error messages
|
|
||||||
|
|
||||||
- Support other programmers and other programming areas
|
Current efforts on ginger are focused on a golang-based virtual machine, which
|
||||||
- Effectively means able to be used in most purposes
|
will then be used to bootstrap the language.
|
||||||
- Able to be quickly learned
|
|
||||||
- Able to be shared
|
|
||||||
- Free
|
|
||||||
- New or improved components shared between computers/platforms/people
|
|
||||||
|
|
||||||
- Support itself
|
If you are on a machine with nix installed, you can run:
|
||||||
- Garner a team to work on the compiler
|
|
||||||
- Team must not require my help for day-to-day
|
|
||||||
- Team must stick to the manifesto, either through the design or through
|
|
||||||
trust
|
|
||||||
|
|
||||||
## The language: A manifesto, defines the concept of the language
|
```
|
||||||
|
nix develop
|
||||||
|
```
|
||||||
|
|
||||||
- Quips
|
from the repo root and you will be dropped into a shell with all dependencies
|
||||||
- Easier is not better
|
(including the correct go version) in your PATH, ready to use.
|
||||||
|
|
||||||
- Data as the language
|
## Demo
|
||||||
- Differentiation between "syntax" and "language", parser vs compiler
|
|
||||||
- Syntax defines the form which is parsed
|
|
||||||
- The parser reads the syntax forms into data structures
|
|
||||||
- Language defines how the syntax is read into data structures and
|
|
||||||
"understood" (i.e. and what is done with those structures).
|
|
||||||
- A language maybe have multiple syntaxes, if they all parse into
|
|
||||||
the same underlying data structures they can be understood in the
|
|
||||||
same way.
|
|
||||||
- A compiler turns the parsed language into machine code. An
|
|
||||||
interpreter performs actions directly based off of the parsed
|
|
||||||
language.
|
|
||||||
|
|
||||||
- Types, instances, and operations
|
An example program which computes the Nth fibonacci number can be found at
|
||||||
- A language has a set of elemental types, and composite types
|
`examples/fib.gg`. You can try it out by doing:
|
||||||
- "The type defines the [fundamental] operations that can be done on the
|
|
||||||
data, the meaning of the data, and the way values of that type can be
|
|
||||||
stored"
|
|
||||||
- Elemental types are all forms of numbers, since numbers are all a
|
|
||||||
computer really knows
|
|
||||||
- Composite types take two forms:
|
|
||||||
- Homogeneous: all composed values are the same type (arrays)
|
|
||||||
- Heterogeneous: all composed values are different
|
|
||||||
- If known size and known types per-index, tuples
|
|
||||||
- A 0-tuple is kind of special, and simply indicates absence of
|
|
||||||
any value.
|
|
||||||
- A third type, Any, indicates that the type is unknown at compile-time.
|
|
||||||
Type information must be passed around with it at runtime.
|
|
||||||
- An operation has an input and output. It does some action on the input
|
|
||||||
to produce the output (presumably). An operation may be performed as
many times as needed, given any value of the input type. The types of
both the input and output are constant, and together they form the
operation's type.
- A value is an instance of a type, where the type is known at compile-time
  (though the type may be Any). Multiple values may be instances of the same
  type. E.g.: 1 and 2 are both instances of int
- A value is immutable
- TODO "value" is a weird word, since an instance of a type has both a
  type and a value. I need to think about this more. "Instance" might be a
  better name

- Stack and scope
  - A function call operates within a scope. The scope has arguments passed
    into it.
  - When a function calls another, that other's scope is said to be "inside"
    the caller's scope.
  - A pure function only operates on the arguments passed into it.
  - A pointer allows for modification outside of the current scope, but only a
    pointer into an outer scope. A function which does this is "impure"

- Built-in
  - Elementals
    - ints (n-bit)
    - tuples
    - stack arrays
      - indexable
      - head/tail
      - reversible (?)
      - appendable
    - functions (?)
    - pointers (?)
    - Any (?)
  - Elementals must be enough to define the type of a variable
  - Ability to create and modify elemental types
    - immutable, pure functions
  - Other builtin functionality:
    - Load/call linked libraries
    - Compile-time macros
    - Red/Blue

- Questions
  - Strings need to be defined in terms of the built-in types, which would be
    an array of lists. But this means I'm married to that definition of a
    string; it'd be difficult for anyone to define their own and have it
    interop. Unless "int" was some kind of macro type that did some fancy
    shit, but that's kind of gross.
  - An idea of the "equality" of two variables being tied not just to their
    value but to the context in which they were created. Would aid in things
    like compiler tagging.
  - There's a "requirement loop" of things which need figuring out:
    - function structure
    - types
    - seq type
    - stack/scope
  - Most likely I'm going to need some kind of elemental type to indicate
    something should happen at compile-time and not runtime, or the other way
    around.

## The roadmap: A plan of action for tackling the language

```
go run ./cmd/eval/main.go "$(cat examples/fib.gg)" 5
```

Where you can replace `5` with any number.
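The note above about operations having a constant input type and output type can be sketched in Go, the project's implementation language. The `Operation` type and `Incr` function here are purely illustrative names, not part of this repo's API: the point is that an operation's type is fully described by its (fixed) input and output types, and it may be applied any number of times to any value of the input type.

```go
package main

import "fmt"

// Operation models the note above: an operation whose type is fully described
// by its constant input and output types. (Illustrative only; not this repo's
// API.)
type Operation func(in int) int

// Incr is one such operation, of type int -> int.
func Incr(in int) int { return in + 1 }

func main() {
	var op Operation = Incr
	// The operation may be performed as many times as needed, for any value
	// of the input type.
	fmt.Println(op(1), op(op(1))) // 2 3
}
```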
37 cmd/eval/main.go Normal file
@@ -0,0 +1,37 @@
package main

import (
	"bytes"
	"fmt"
	"os"

	"code.betamike.com/mediocregopher/ginger/gg"
	"code.betamike.com/mediocregopher/ginger/vm"
)

func main() {

	if len(os.Args) < 3 {
		fmt.Printf("Usage: %s <operation source> \"in = <value>\"\n", os.Args[0])
		return
	}

	opSrc := os.Args[1]
	inSrc := os.Args[2]

	inVal, err := gg.NewDecoder(bytes.NewBufferString(inSrc)).Next()
	if err != nil {
		panic(fmt.Sprintf("decoding input: %v", err))
	}

	res, err := vm.EvaluateSource(
		bytes.NewBufferString(opSrc),
		vm.Value{Value: inVal.Value},
		vm.GlobalScope,
	)
	if err != nil {
		panic(fmt.Sprintf("evaluating: %v", err))
	}

	fmt.Println(res)
}
53 examples/examples_test.go Normal file
@@ -0,0 +1,53 @@
package examples_test

import (
	"embed"
	"fmt"
	"testing"

	"code.betamike.com/mediocregopher/ginger/gg"
	"code.betamike.com/mediocregopher/ginger/vm"
	"github.com/stretchr/testify/assert"
)

//go:embed *.gg
var examplesFS embed.FS

func TestAllExamples(t *testing.T) {

	tests := []struct {
		path string
		in   vm.Value
		exp  vm.Value
	}{
		{
			path: "fib.gg",
			in:   vm.Value{Value: gg.Number(5)},
			exp:  vm.Value{Value: gg.Number(5)},
		},
		{
			path: "fib.gg",
			in:   vm.Value{Value: gg.Number(10)},
			exp:  vm.Value{Value: gg.Number(55)},
		},
		{
			path: "fib.gg",
			in:   vm.Value{Value: gg.Number(69)},
			exp:  vm.Value{Value: gg.Number(117669030460994)},
		},
	}

	for _, test := range tests {
		t.Run(fmt.Sprintf("%s_%v", test.path, test.in), func(t *testing.T) {
			f, err := examplesFS.Open(test.path)
			if err != nil {
				t.Fatal(err)
			}
			defer f.Close()

			got, err := vm.EvaluateSource(f, test.in, vm.GlobalScope)
			assert.NoError(t, err)
			assert.True(t, test.exp.Equal(got), "%v != %v", test.exp, got)
		})
	}
}
24 examples/fib.gg Normal file
@@ -0,0 +1,24 @@
* A function which accepts a number N and returns the Nth fibonacci number
{
	* We are passing a tuple of inputs into a graph here, such that the graph is
	* evaluated as an anonymous function. That anonymous function uses !recur
	* internally to compute the result.
	!out = {

		* A little helper function.
		decr = { !out = !add < (!in, -1) };

		* Deconstruct the input tuple into its individual elements, for clarity.
		* There will be a more ergonomic way of doing this one day.
		n = !tupEl < (!in, 0);
		a = !tupEl < (!in, 1);
		b = !tupEl < (!in, 2);

		!out = !if < (
			!isZero < n,
			a,
			!recur < ( decr<n, b, !add<(a,b) ),
		);

	} < (!in, 0, 1);
}
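The `!recur` pattern in fib.gg can be restated as a short Go sketch (for clarity only; not part of the repo): each step receives the triple `(n, a, b)` and recurses with `(n-1, b, a+b)` until `n` reaches zero, at which point `a` holds the answer. Being seeded with `(!in, 0, 1)` matches the outer tuple in the example.

```go
package main

import "fmt"

// fib mirrors the accumulator-style recursion fib.gg performs with !recur:
// decrement n, shift b into a's position, and accumulate a+b into b.
func fib(n, a, b int) int {
	if n == 0 {
		return a
	}
	return fib(n-1, b, a+b)
}

func main() {
	// Seeded with (n, 0, 1), as in the example's trailing tuple.
	fmt.Println(fib(5, 0, 1), fib(10, 0, 1)) // 5 55, matching the test cases
}
```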
219 expr/build.go
@@ -1,219 +0,0 @@
package expr

import (
	"fmt"
	"log"

	"llvm.org/llvm/bindings/go/llvm"
)

func init() {
	log.Printf("initializing llvm")
	llvm.LinkInMCJIT()
	llvm.InitializeNativeTarget()
	llvm.InitializeNativeAsmPrinter()
}

type BuildCtx struct {
	B llvm.Builder
	M llvm.Module
}

func NewBuildCtx(moduleName string) BuildCtx {
	return BuildCtx{
		B: llvm.NewBuilder(),
		M: llvm.NewModule(moduleName),
	}
}

func (bctx BuildCtx) Build(ctx Ctx, stmts ...Statement) llvm.Value {
	var lastVal llvm.Value
	for _, stmt := range stmts {
		if e := bctx.BuildStmt(ctx, stmt); e != nil {
			if lv, ok := e.(llvmVal); ok {
				lastVal = llvm.Value(lv)
			} else {
				log.Printf("BuildStmt returned non llvmVal from %v: %v (%T)", stmt, e, e)
			}
		}
	}
	if (lastVal == llvm.Value{}) {
		lastVal = bctx.B.CreateRetVoid()
	}
	return lastVal
}

func (bctx BuildCtx) BuildStmt(ctx Ctx, s Statement) Expr {
	log.Printf("building: %v", s)
	switch o := s.Op.(type) {
	case Macro:
		return ctx.Macro(o)(bctx, ctx, s.Arg)
	case Identifier:
		s2 := s
		s2.Op = ctx.Identifier(o).(llvmVal)
		return bctx.BuildStmt(ctx, s2)
	case Statement:
		s2 := s
		s2.Op = bctx.BuildStmt(ctx, o)
		return bctx.BuildStmt(ctx, s2)
	case llvmVal:
		arg := bctx.buildExpr(ctx, s.Arg).(llvmVal)
		out := bctx.B.CreateCall(llvm.Value(o), []llvm.Value{llvm.Value(arg)}, "")
		return llvmVal(out)
	default:
		panic(fmt.Sprintf("non op type %v (%T)", s.Op, s.Op))
	}
}

// may return nil if e is a Statement which has no return
func (bctx BuildCtx) buildExpr(ctx Ctx, e Expr) Expr {
	return bctx.buildExprTill(ctx, e, func(Expr) bool { return false })
}

// like buildExpr, but will stop short and stop recursing when the function
// returns true
func (bctx BuildCtx) buildExprTill(ctx Ctx, e Expr, fn func(e Expr) bool) Expr {
	if fn(e) {
		return e
	}

	switch ea := e.(type) {
	case llvmVal:
		return e
	case Int:
		return llvmVal(llvm.ConstInt(llvm.Int64Type(), uint64(ea), false))
	case Identifier:
		return ctx.Identifier(ea)
	case Statement:
		return bctx.BuildStmt(ctx, ea)
	case Tuple:
		// if the tuple is empty then it is a void
		if len(ea) == 0 {
			return llvmVal(llvm.Undef(llvm.VoidType()))
		}

		ea2 := make(Tuple, len(ea))
		for i := range ea {
			ea2[i] = bctx.buildExprTill(ctx, ea[i], fn)
		}

		// if the fields of the tuple are all llvmVal then we can make a proper
		// struct
		vals := make([]llvm.Value, len(ea2))
		typs := make([]llvm.Type, len(ea2))
		for i := range ea2 {
			if v, ok := ea2[i].(llvmVal); ok {
				val := llvm.Value(v)
				vals[i] = val
				typs[i] = val.Type()
			} else {
				return ea2
			}
		}

		str := llvm.Undef(llvm.StructType(typs, false))
		for i := range vals {
			str = bctx.B.CreateInsertValue(str, vals[i], i, "")
		}
		return llvmVal(str)
	case List:
		ea2 := make(Tuple, len(ea))
		for i := range ea {
			ea2[i] = bctx.buildExprTill(ctx, ea[i], fn)
		}
		return ea2
	case Ctx:
		return ea
	default:
		panicf("%v (type %T) can't express a value", ea, ea)
	}
	panic("go is dumb")
}

func (bctx BuildCtx) buildVal(ctx Ctx, e Expr) llvm.Value {
	return llvm.Value(bctx.buildExpr(ctx, e).(llvmVal))
}

// globalCtx describes what's available to *all* contexts, and is what all
// contexts should have as the root parent in the tree.
//
// We define it in this weird way because NewCtx actually references globalCtx
var globalCtx *Ctx
var _ = func() bool {
	globalCtx = &Ctx{
		macros: map[Macro]MacroFn{
			"add": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				tup := bctx.buildExpr(ctx, e).(llvmVal)
				a := bctx.B.CreateExtractValue(llvm.Value(tup), 0, "")
				b := bctx.B.CreateExtractValue(llvm.Value(tup), 1, "")
				return llvmVal(bctx.B.CreateAdd(a, b, ""))
			},

			// TODO this could be a user macro!!!! WUT this language is baller
			"bind": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				tup := bctx.buildExprTill(ctx, e, isIdentifier).(Tuple)
				id := bctx.buildExprTill(ctx, tup[0], isIdentifier).(Identifier)
				val := bctx.buildExpr(ctx, tup[1])
				ctx.Bind(id, val)
				return NewTuple()
			},

			"ctxnew": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				return NewCtx()
			},

			"ctxthis": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				return ctx
			},

			"ctxbind": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				tup := bctx.buildExprTill(ctx, e, isIdentifier).(Tuple)
				thisCtx := bctx.buildExpr(ctx, tup[0]).(Ctx)
				id := bctx.buildExprTill(ctx, tup[1], isIdentifier).(Identifier)
				thisCtx.Bind(id, bctx.buildExpr(ctx, tup[2]))
				return NewTuple()
			},

			"ctxget": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				tup := bctx.buildExprTill(ctx, e, isIdentifier).(Tuple)
				thisCtx := bctx.buildExpr(ctx, tup[0]).(Ctx)
				id := bctx.buildExprTill(ctx, tup[1], isIdentifier).(Identifier)
				return thisCtx.Identifier(id)
			},

			"do": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				tup := bctx.buildExprTill(ctx, e, isStmt).(Tuple)
				thisCtx := tup[0].(Ctx)
				for _, stmtE := range tup[1].(List) {
					bctx.BuildStmt(thisCtx, stmtE.(Statement))
				}
				return NewTuple()
			},

			"op": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				l := bctx.buildExprTill(ctx, e, isList).(List)
				stmts := make([]Statement, len(l))
				for i := range l {
					stmts[i] = l[i].(Statement)
				}

				// TODO obviously this needs to be fixed
				fn := llvm.AddFunction(bctx.M, "", llvm.FunctionType(llvm.Int64Type(), []llvm.Type{llvm.Int64Type()}, false))
				fnbl := llvm.AddBasicBlock(fn, "")

				prevbl := bctx.B.GetInsertBlock()
				bctx.B.SetInsertPoint(fnbl, fnbl.FirstInstruction())
				out := bctx.Build(NewCtx(), stmts...)
				bctx.B.CreateRet(out)
				bctx.B.SetInsertPointAtEnd(prevbl)
				return llvmVal(fn)
			},

			"in": func(bctx BuildCtx, ctx Ctx, e Expr) Expr {
				fn := bctx.B.GetInsertBlock().Parent()
				return llvmVal(fn.Param(0))
			},
		},
	}
	return false
}()
@@ -1,99 +0,0 @@
package expr

import (
	"fmt"
	. "testing"

	"llvm.org/llvm/bindings/go/llvm"
)

func buildTest(t *T, expected int64, stmts ...Statement) {
	fmt.Println("-----------------------------------------")
	ctx := NewCtx()
	bctx := NewBuildCtx("")

	fn := llvm.AddFunction(bctx.M, "", llvm.FunctionType(llvm.Int64Type(), []llvm.Type{}, false))
	fnbl := llvm.AddBasicBlock(fn, "")
	bctx.B.SetInsertPoint(fnbl, fnbl.FirstInstruction())
	out := bctx.Build(ctx, stmts...)
	bctx.B.CreateRet(out)

	fmt.Println("######## dumping IR")
	bctx.M.Dump()
	fmt.Println("######## done dumping IR")

	if err := llvm.VerifyModule(bctx.M, llvm.ReturnStatusAction); err != nil {
		t.Fatal(err)
	}

	eng, err := llvm.NewExecutionEngine(bctx.M)
	if err != nil {
		t.Fatal(err)
	}

	res := eng.RunFunction(fn, []llvm.GenericValue{}).Int(false)
	if int64(res) != expected {
		t.Errorf("expected:[%T]%v actual:[%T]%v", expected, expected, res, res)
	}
}

func TestAdd(t *T) {
	buildTest(t, 2,
		NewStatement(Macro("add"), Int(1), Int(1)))
	buildTest(t, 4,
		NewStatement(Macro("add"), Int(1),
			NewStatement(Macro("add"), Int(1), Int(2))))
	buildTest(t, 6,
		NewStatement(Macro("add"),
			NewStatement(Macro("add"), Int(1), Int(2)),
			NewStatement(Macro("add"), Int(1), Int(2))))
}

func TestBind(t *T) {
	buildTest(t, 2,
		NewStatement(Macro("bind"), Identifier("A"), Int(1)),
		NewStatement(Macro("add"), Identifier("A"), Int(1)))
	buildTest(t, 2,
		NewStatement(Macro("bind"), Identifier("A"), Int(1)),
		NewStatement(Macro("add"), Identifier("A"), Identifier("A")))
	buildTest(t, 2,
		NewStatement(Macro("bind"), Identifier("A"), NewTuple(Int(1), Int(1))),
		NewStatement(Macro("add"), Identifier("A")))
	buildTest(t, 3,
		NewStatement(Macro("bind"), Identifier("A"), NewTuple(Int(1), Int(1))),
		NewStatement(Macro("add"), Int(1),
			NewStatement(Macro("add"), Identifier("A"))))
	buildTest(t, 4,
		NewStatement(Macro("bind"), Identifier("A"), NewTuple(Int(1), Int(1))),
		NewStatement(Macro("add"),
			NewStatement(Macro("add"), Identifier("A")),
			NewStatement(Macro("add"), Identifier("A"))))
}

func TestOp(t *T) {
	incr := NewStatement(Macro("op"),
		NewList(
			NewStatement(Macro("add"), Int(1), NewStatement(Macro("in"))),
		),
	)

	// bound op
	buildTest(t, 2,
		NewStatement(Macro("bind"), Identifier("incr"), incr),
		NewStatement(Identifier("incr"), Int(1)))

	// double bound op
	buildTest(t, 3,
		NewStatement(Macro("bind"), Identifier("incr"), incr),
		NewStatement(Identifier("incr"),
			NewStatement(Identifier("incr"), Int(1))))

	// anon op
	buildTest(t, 2,
		NewStatement(incr, Int(1)))

	// double anon op
	buildTest(t, 3,
		NewStatement(incr,
			NewStatement(incr, Int(1))))
}
72 expr/ctx.go
@@ -1,72 +0,0 @@
package expr

// MacroFn is a compiler function which takes in an existing Expr and returns
// the llvm Value for it
type MacroFn func(BuildCtx, Ctx, Expr) Expr

// Ctx contains all the Macros and Identifiers available. A Ctx also keeps a
// reference to the global context, which has a number of macros available for
// all contexts to use.
type Ctx struct {
	global *Ctx
	macros map[Macro]MacroFn
	idents map[Identifier]Expr
}

// NewCtx returns a blank context instance
func NewCtx() Ctx {
	return Ctx{
		global: globalCtx,
		macros: map[Macro]MacroFn{},
		idents: map[Identifier]Expr{},
	}
}

// Macro returns the MacroFn associated with the given identifier, or panics
// if the macro isn't found
func (c Ctx) Macro(m Macro) MacroFn {
	if fn := c.macros[m]; fn != nil {
		return fn
	}
	if fn := c.global.macros[m]; fn != nil {
		return fn
	}
	panicf("macro %q not found in context", m)
	return nil
}

// Identifier returns the Expr bound to the Identifier, or panics
func (c Ctx) Identifier(i Identifier) Expr {
	if e := c.idents[i]; e != nil {
		return e
	}
	// The global context doesn't have any identifiers, so don't bother checking
	panicf("identifier %q not found", i)
	panic("go is dumb")
}

// Copy returns a deep copy of the Ctx
func (c Ctx) Copy() Ctx {
	cc := Ctx{
		global: c.global,
		macros: make(map[Macro]MacroFn, len(c.macros)),
		idents: make(map[Identifier]Expr, len(c.idents)),
	}
	for m, mfn := range c.macros {
		cc.macros[m] = mfn
	}
	for i, e := range c.idents {
		cc.idents[i] = e
	}
	return cc
}

// Bind binds the given Identifier to the given Expr within this Ctx. Will
// panic if the Identifier is already bound
func (c Ctx) Bind(i Identifier, e Expr) {
	if _, ok := c.idents[i]; ok {
		panicf("identifier %q is already bound", i)
	}
	c.idents[i] = e
}
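The lookup chain `Ctx.Macro` implements above (local table first, then the global one, then panic) can be shown as a self-contained Go sketch. The names here (`lookupCtx`, `lookup`) are illustrative only, not this repo's API; the fallback logic is the point.

```go
package main

import "fmt"

// lookupCtx is a minimal sketch of a two-level macro table: a local map with a
// global fallback, mirroring the fallback chain in Ctx.Macro.
type lookupCtx struct {
	global map[string]string
	local  map[string]string
}

// lookup consults the local table first, then the global one, and panics if
// the name is in neither — the same failure mode as Ctx.Macro.
func (c lookupCtx) lookup(name string) string {
	if v, ok := c.local[name]; ok {
		return v
	}
	if v, ok := c.global[name]; ok {
		return v
	}
	panic(fmt.Sprintf("macro %q not found in context", name))
}

func main() {
	c := lookupCtx{
		global: map[string]string{"add": "builtin"},
		local:  map[string]string{"incr": "user-defined"},
	}
	fmt.Println(c.lookup("incr"), c.lookup("add")) // user-defined builtin
}
```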
210 expr/expr.go
@@ -1,210 +0,0 @@
package expr

import (
	"fmt"

	"llvm.org/llvm/bindings/go/llvm"
)

// Expr represents the actual expression in question.
type Expr interface{}

// equaler is used to compare two expressions. The comparison should not take
// into account Token values, only the actual value being represented
type equaler interface {
	equal(equaler) bool
}

// will panic if either Expr doesn't implement equaler
func exprEqual(e1, e2 Expr) bool {
	eq1, ok1 := e1.(equaler)
	eq2, ok2 := e2.(equaler)
	if !ok1 || !ok2 {
		panic(fmt.Sprintf("can't compare %T and %T", e1, e2))
	}
	return eq1.equal(eq2)
}

////////////////////////////////////////////////////////////////////////////////

// an Expr which simply wraps an existing llvm.Value
type llvmVal llvm.Value

/*
func voidVal(lctx LLVMCtx) llvmVal {
	return llvmVal{lctx.B.CreateRetVoid()}
}
*/

////////////////////////////////////////////////////////////////////////////////

/*
// Void represents no data (size = 0)
type Void struct{}

func (v Void) equal(e equaler) bool {
	_, ok := e.(Void)
	return ok
}
*/

////////////////////////////////////////////////////////////////////////////////
/*
// Bool represents a true or false value
type Bool bool

func (b Bool) equal(e equaler) bool {
	bb, ok := e.(Bool)
	if !ok {
		return false
	}
	return bb == b
}
*/
////////////////////////////////////////////////////////////////////////////////

// Int represents an integer value
type Int int64

func (i Int) equal(e equaler) bool {
	ii, ok := e.(Int)
	return ok && ii == i
}

func (i Int) String() string {
	return fmt.Sprintf("%d", i)
}

////////////////////////////////////////////////////////////////////////////////
/*
// String represents a string value
type String string

func (s String) equal(e equaler) bool {
	ss, ok := e.(String)
	if !ok {
		return false
	}
	return ss == s
}
*/
////////////////////////////////////////////////////////////////////////////////

// Identifier represents a binding to some other value which has been given a
// name
type Identifier string

func (id Identifier) equal(e equaler) bool {
	idid, ok := e.(Identifier)
	return ok && idid == id
}

func isIdentifier(e Expr) bool {
	_, ok := e.(Identifier)
	return ok
}

////////////////////////////////////////////////////////////////////////////////

// Macro is an identifier for a macro which can be used to transform
// expressions. The tokens for macros start with a '%', but the Macro identifier
// itself has that stripped off
type Macro string

// String returns the Macro with a '%' prepended to it
func (m Macro) String() string {
	return "%" + string(m)
}

func (m Macro) equal(e equaler) bool {
	mm, ok := e.(Macro)
	return ok && m == mm
}

////////////////////////////////////////////////////////////////////////////////

// Tuple represents a fixed set of expressions which are interacted with as if
// they were a single value
type Tuple []Expr

// NewTuple returns a Tuple around the given list of Exprs
func NewTuple(ee ...Expr) Tuple {
	return Tuple(ee)
}

func (tup Tuple) String() string {
	return "(" + exprsJoin(tup) + ")"
}

func (tup Tuple) equal(e equaler) bool {
	tuptup, ok := e.(Tuple)
	return ok && exprsEqual(tup, tuptup)
}

func isTuple(e Expr) bool {
	_, ok := e.(Tuple)
	return ok
}

////////////////////////////////////////////////////////////////////////////////

// List represents an ordered set of Exprs, all of the same type. A List's size
// does not affect its type signature, unlike a Tuple
type List []Expr

// NewList returns a List around the given list of Exprs
func NewList(ee ...Expr) List {
	return List(ee)
}

func (l List) String() string {
	return "[" + exprsJoin(l) + "]"
}

func (l List) equal(e equaler) bool {
	ll, ok := e.(List)
	return ok && exprsEqual(l, ll)
}

func isList(e Expr) bool {
	_, ok := e.(List)
	return ok
}

////////////////////////////////////////////////////////////////////////////////

// Statement represents an actual action which will be taken. The input value is
// used as the input to the pipe, and the output of the pipe is the output of
// the statement
type Statement struct {
	Op, Arg Expr
}

// NewStatement returns a Statement whose Op is the first Expr. If the given
// list is empty Arg will be a 0-tuple, if its length is one Arg will be that
// single Expr, otherwise Arg will be a Tuple of the list
func NewStatement(e Expr, ee ...Expr) Statement {
	s := Statement{Op: e}
	if len(ee) > 1 {
		s.Arg = NewTuple(ee...)
	} else if len(ee) == 1 {
		s.Arg = ee[0]
	} else if len(ee) == 0 {
		s.Arg = NewTuple()
	}
	return s
}

func (s Statement) String() string {
	return fmt.Sprintf("(%v %s)", s.Op, s.Arg)
}

func (s Statement) equal(e equaler) bool {
	ss, ok := e.(Statement)
	return ok && exprEqual(s.Op, ss.Op) && exprEqual(s.Arg, ss.Arg)
}

func isStmt(e Expr) bool {
	_, ok := e.(Statement)
	return ok
}
299 expr/parse.go
@@ -1,299 +0,0 @@
package expr

//type exprErr struct {
//	reason string
//	err    error
//	tok    lexer.Token
//	tokCtx string // e.g. "block starting at" or "open paren at"
//}
//
//func (e exprErr) Error() string {
//	var msg string
//	if e.err != nil {
//		msg = e.err.Error()
//	} else {
//		msg = e.reason
//	}
//	if err := e.tok.Err(); err != nil {
//		msg += " - token error: " + err.Error()
//	} else if (e.tok != lexer.Token{}) {
//		msg += " - "
//		if e.tokCtx != "" {
//			msg += e.tokCtx + ": "
//		}
//		msg = fmt.Sprintf("%s [line:%d col:%d]", msg, e.tok.Row, e.tok.Col)
//	}
//	return msg
//}
//
//////////////////////////////////////////////////////////////////////////////////
//
//// toks[0] must be start
//func sliceEnclosedToks(toks []lexer.Token, start, end lexer.Token) ([]lexer.Token, []lexer.Token, error) {
//	c := 1
//	ret := []lexer.Token{}
//	first := toks[0]
//	for i, tok := range toks[1:] {
//		if tok.Err() != nil {
//			return nil, nil, exprErr{
//				reason: fmt.Sprintf("missing closing %v", end),
//				tok:    tok,
//			}
//		}
//
//		if tok.Equal(start) {
//			c++
//		} else if tok.Equal(end) {
//			c--
//		}
//		if c == 0 {
//			return ret, toks[2+i:], nil
//		}
//		ret = append(ret, tok)
//	}
//
//	return nil, nil, exprErr{
//		reason: fmt.Sprintf("missing closing %v", end),
//		tok:    first,
//		tokCtx: "starting at",
//	}
//}
//
//// Parse reads in all expressions it can from the given io.Reader and returns
//// them
//func Parse(r io.Reader) ([]Expr, error) {
//	toks := readAllToks(r)
//	var ret []Expr
//	var expr Expr
//	var err error
//	for len(toks) > 0 {
//		if toks[0].TokenType == lexer.EOF {
//			return ret, nil
//		}
//		expr, toks, err = parse(toks)
//		if err != nil {
//			return nil, err
//		}
//		ret = append(ret, expr)
//	}
//	return ret, nil
//}
//
//// ParseAsBlock reads the given io.Reader as if it was implicitly surrounded by
//// curly braces, making it into a Block. This means all expressions from the
//// io.Reader *must* be statements. The returned Expr's Actual will always be a
//// Block.
//func ParseAsBlock(r io.Reader) (Expr, error) {
//	return parseBlock(readAllToks(r))
//}
//
//func readAllToks(r io.Reader) []lexer.Token {
//	l := lexer.New(r)
//	var toks []lexer.Token
//	for l.HasNext() {
//		toks = append(toks, l.Next())
//	}
//	return toks
//}
//
//// For all parse methods it is assumed that toks is not empty
//
//var (
//	openParen  = lexer.Token{TokenType: lexer.Wrapper, Val: "("}
//	closeParen = lexer.Token{TokenType: lexer.Wrapper, Val: ")"}
//	openCurly  = lexer.Token{TokenType: lexer.Wrapper, Val: "{"}
//	closeCurly = lexer.Token{TokenType: lexer.Wrapper, Val: "}"}
//	comma      = lexer.Token{TokenType: lexer.Punctuation, Val: ","}
//	arrow      = lexer.Token{TokenType: lexer.Punctuation, Val: ">"}
//)
//
//func parse(toks []lexer.Token) (Expr, []lexer.Token, error) {
//	expr, toks, err := parseSingle(toks)
//	if err != nil {
//		return Expr{}, nil, err
//	}
//
//	if len(toks) > 0 && toks[0].TokenType == lexer.Punctuation {
//		return parseConnectingPunct(toks, expr)
//	}
//
//	return expr, toks, nil
//}
//
//func parseSingle(toks []lexer.Token) (Expr, []lexer.Token, error) {
//	var expr Expr
//	var err error
//
//	if toks[0].Err() != nil {
//		return Expr{}, nil, exprErr{
//			reason: "could not parse token",
//			tok:    toks[0],
//		}
//	}
//
//	if toks[0].Equal(openParen) {
//		starter := toks[0]
//		var ptoks []lexer.Token
//		ptoks, toks, err = sliceEnclosedToks(toks, openParen, closeParen)
//		if err != nil {
//			return Expr{}, nil, err
//		}
//
//		if expr, ptoks, err = parse(ptoks); err != nil {
//			return Expr{}, nil, err
//		} else if len(ptoks) > 0 {
//			return Expr{}, nil, exprErr{
//				reason: "multiple expressions inside parenthesis",
//				tok:    starter,
//				tokCtx: "starting at",
//			}
//		}
//		return expr, toks, nil
//
//	} else if toks[0].Equal(openCurly) {
//		var btoks []lexer.Token
//		btoks, toks, err = sliceEnclosedToks(toks, openCurly, closeCurly)
//		if err != nil {
//			return Expr{}, nil, err
//		}
//
//		if expr, err = parseBlock(btoks); err != nil {
//			return Expr{}, nil, err
//		}
//		return expr, toks, nil
//	}
//
//	if expr, err = parseNonPunct(toks[0]); err != nil {
//		return Expr{}, nil, err
//	}
//	return expr, toks[1:], nil
//}
//
//func parseNonPunct(tok lexer.Token) (Expr, error) {
//	if tok.TokenType == lexer.Identifier {
//		return parseIdentifier(tok)
//	} else if tok.TokenType == lexer.String {
//		//return parseString(tok)
//	}
//
//	return Expr{}, exprErr{
//		reason: "unexpected non-punctuation token",
//		tok:    tok,
//	}
//}
//
//func parseIdentifier(t lexer.Token) (Expr, error) {
//	e := Expr{Token: t}
//	if t.Val[0] == '-' || (t.Val[0] >= '0' && t.Val[0] <= '9') {
//		n, err := strconv.ParseInt(t.Val, 10, 64)
//		if err != nil {
//			return Expr{}, exprErr{
//				err: err,
//				tok: t,
//			}
//		}
//		e.Actual = Int(n)
//
//		/*
//			} else if t.Val == "%true" {
//				e.Actual = Bool(true)
//
//			} else if t.Val == "%false" {
//				e.Actual = Bool(false)
//		*/
//
//	} else if t.Val[0] == '%' {
//		e.Actual = Macro(t.Val[1:])
//
//	} else {
//		e.Actual = Identifier(t.Val)
//	}
//
//	return e, nil
//}
//
///*
//func parseString(t lexer.Token) (Expr, error) {
//	str, err := strconv.Unquote(t.Val)
//	if err != nil {
//		return Expr{}, exprErr{
//			err: err,
//			tok: t,
//		}
//	}
//	return Expr{Token: t, Actual: String(str)}, nil
//}
//*/
//
//func parseConnectingPunct(toks []lexer.Token, root Expr) (Expr, []lexer.Token, error) {
//	if toks[0].Equal(comma) {
//		return parseTuple(toks, root)
//
//	} else if toks[0].Equal(arrow) {
//		expr, toks, err := parse(toks[1:])
//		if err != nil {
//			return Expr{}, nil, err
|
|
||||||
// }
|
|
||||||
// return Expr{Token: root.Token, Actual: Statement{In: root, To: expr}}, toks, nil
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// return root, toks, nil
|
|
||||||
//}
|
|
||||||
//
|
|
||||||
//func parseTuple(toks []lexer.Token, root Expr) (Expr, []lexer.Token, error) {
|
|
||||||
// rootTup, ok := root.Actual.(Tuple)
|
|
||||||
// if !ok {
|
|
||||||
// rootTup = Tuple{root}
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// // rootTup is modified throughout, be we need to make it into an Expr for
|
|
||||||
// // every return, which is annoying. so make a function to do it on the fly
|
|
||||||
// mkRoot := func() Expr {
|
|
||||||
// return Expr{Token: rootTup[0].Token, Actual: rootTup}
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// if len(toks) < 2 {
|
|
||||||
// return mkRoot(), toks, nil
|
|
||||||
// } else if !toks[0].Equal(comma) {
|
|
||||||
// if toks[0].TokenType == lexer.Punctuation {
|
|
||||||
// return parseConnectingPunct(toks, mkRoot())
|
|
||||||
// }
|
|
||||||
// return mkRoot(), toks, nil
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// var expr Expr
|
|
||||||
// var err error
|
|
||||||
// if expr, toks, err = parseSingle(toks[1:]); err != nil {
|
|
||||||
// return Expr{}, nil, err
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// rootTup = append(rootTup, expr)
|
|
||||||
// return parseTuple(toks, mkRoot())
|
|
||||||
//}
|
|
||||||
//
|
|
||||||
//// parseBlock assumes that the given token list is the entire block, already
|
|
||||||
//// pulled from outer curly braces by sliceEnclosedToks, or determined to be the
|
|
||||||
//// entire block in some other way.
|
|
||||||
//func parseBlock(toks []lexer.Token) (Expr, error) {
|
|
||||||
// b := Block{}
|
|
||||||
// first := toks[0]
|
|
||||||
// var expr Expr
|
|
||||||
// var err error
|
|
||||||
// for {
|
|
||||||
// if len(toks) == 0 {
|
|
||||||
// return Expr{Token: first, Actual: b}, nil
|
|
||||||
// }
|
|
||||||
//
|
|
||||||
// if expr, toks, err = parse(toks); err != nil {
|
|
||||||
// return Expr{}, err
|
|
||||||
// }
|
|
||||||
// if _, ok := expr.Actual.(Statement); !ok {
|
|
||||||
// return Expr{}, exprErr{
|
|
||||||
// reason: "blocks may only contain full statements",
|
|
||||||
// tok: expr.Token,
|
|
||||||
// tokCtx: "non-statement here",
|
|
||||||
// }
|
|
||||||
// }
|
|
||||||
// b = append(b, expr)
|
|
||||||
// }
|
|
||||||
//}
|
|
@ -1,149 +0,0 @@
package expr

//import . "testing"

//func TestSliceEnclosedToks(t *T) {
//	doAssert := func(in, expOut, expRem []lexer.Token) {
//		out, rem, err := sliceEnclosedToks(in, openParen, closeParen)
//		require.Nil(t, err)
//		assert.Equal(t, expOut, out)
//		assert.Equal(t, expRem, rem)
//	}
//	foo := lexer.Token{TokenType: lexer.Identifier, Val: "foo"}
//	bar := lexer.Token{TokenType: lexer.Identifier, Val: "bar"}
//
//	toks := []lexer.Token{openParen, closeParen}
//	doAssert(toks, []lexer.Token{}, []lexer.Token{})
//
//	toks = []lexer.Token{openParen, foo, closeParen, bar}
//	doAssert(toks, []lexer.Token{foo}, []lexer.Token{bar})
//
//	toks = []lexer.Token{openParen, foo, foo, closeParen, bar, bar}
//	doAssert(toks, []lexer.Token{foo, foo}, []lexer.Token{bar, bar})
//
//	toks = []lexer.Token{openParen, foo, openParen, bar, closeParen, closeParen}
//	doAssert(toks, []lexer.Token{foo, openParen, bar, closeParen}, []lexer.Token{})
//
//	toks = []lexer.Token{openParen, foo, openParen, bar, closeParen, bar, closeParen, foo}
//	doAssert(toks, []lexer.Token{foo, openParen, bar, closeParen, bar}, []lexer.Token{foo})
//}
//
//func assertParse(t *T, in []lexer.Token, expExpr Expr, expOut []lexer.Token) {
//	expr, out, err := parse(in)
//	require.Nil(t, err)
//	assert.True(t, expExpr.equal(expr), "expr:%+v expExpr:%+v", expr, expExpr)
//	assert.Equal(t, expOut, out, "out:%v expOut:%v", out, expOut)
//}
//
//func TestParseSingle(t *T) {
//	foo := lexer.Token{TokenType: lexer.Identifier, Val: "foo"}
//	fooM := lexer.Token{TokenType: lexer.Identifier, Val: "%foo"}
//	fooExpr := Expr{Actual: Identifier("foo")}
//	fooMExpr := Expr{Actual: Macro("foo")}
//
//	toks := []lexer.Token{foo}
//	assertParse(t, toks, fooExpr, []lexer.Token{})
//
//	toks = []lexer.Token{foo, foo}
//	assertParse(t, toks, fooExpr, []lexer.Token{foo})
//
//	toks = []lexer.Token{openParen, foo, closeParen, foo}
//	assertParse(t, toks, fooExpr, []lexer.Token{foo})
//
//	toks = []lexer.Token{openParen, openParen, foo, closeParen, closeParen, foo}
//	assertParse(t, toks, fooExpr, []lexer.Token{foo})
//
//	toks = []lexer.Token{fooM, foo}
//	assertParse(t, toks, fooMExpr, []lexer.Token{foo})
//}
//
//func TestParseTuple(t *T) {
//	tup := func(ee ...Expr) Expr {
//		return Expr{Actual: Tuple(ee)}
//	}
//
//	foo := lexer.Token{TokenType: lexer.Identifier, Val: "foo"}
//	fooExpr := Expr{Actual: Identifier("foo")}
//
//	toks := []lexer.Token{foo, comma, foo}
//	assertParse(t, toks, tup(fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, comma, foo, foo}
//	assertParse(t, toks, tup(fooExpr, fooExpr), []lexer.Token{foo})
//
//	toks = []lexer.Token{foo, comma, foo, comma, foo}
//	assertParse(t, toks, tup(fooExpr, fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, comma, foo, comma, foo, comma, foo}
//	assertParse(t, toks, tup(fooExpr, fooExpr, fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, comma, openParen, foo, comma, foo, closeParen, comma, foo}
//	assertParse(t, toks, tup(fooExpr, tup(fooExpr, fooExpr), fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, comma, openParen, foo, comma, foo, closeParen, comma, foo, foo}
//	assertParse(t, toks, tup(fooExpr, tup(fooExpr, fooExpr), fooExpr), []lexer.Token{foo})
//}
//
//func TestParseStatement(t *T) {
//	stmt := func(in, to Expr) Expr {
//		return Expr{Actual: Statement{In: in, To: to}}
//	}
//
//	foo := lexer.Token{TokenType: lexer.Identifier, Val: "foo"}
//	fooExpr := Expr{Actual: Identifier("foo")}
//
//	toks := []lexer.Token{foo, arrow, foo}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{openParen, foo, arrow, foo, closeParen}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, arrow, openParen, foo, closeParen}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, arrow, foo}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{foo, arrow, foo, foo}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{foo})
//
//	toks = []lexer.Token{foo, arrow, openParen, foo, closeParen, foo}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{foo})
//
//	toks = []lexer.Token{openParen, foo, closeParen, arrow, openParen, foo, closeParen, foo}
//	assertParse(t, toks, stmt(fooExpr, fooExpr), []lexer.Token{foo})
//
//	fooTupExpr := Expr{Actual: Tuple{fooExpr, fooExpr}}
//	toks = []lexer.Token{foo, arrow, openParen, foo, comma, foo, closeParen, foo}
//	assertParse(t, toks, stmt(fooExpr, fooTupExpr), []lexer.Token{foo})
//
//	toks = []lexer.Token{foo, comma, foo, arrow, foo}
//	assertParse(t, toks, stmt(fooTupExpr, fooExpr), []lexer.Token{})
//
//	toks = []lexer.Token{openParen, foo, comma, foo, closeParen, arrow, foo}
//	assertParse(t, toks, stmt(fooTupExpr, fooExpr), []lexer.Token{})
//}
//
//func TestParseBlock(t *T) {
//	stmt := func(in, to Expr) Expr {
//		return Expr{Actual: Statement{In: in, To: to}}
//	}
//	block := func(stmts ...Expr) Expr {
//		return Expr{Actual: Block(stmts)}
//	}
//
//	foo := lexer.Token{TokenType: lexer.Identifier, Val: "foo"}
//	fooExpr := Expr{Actual: Identifier("foo")}
//
//	toks := []lexer.Token{openCurly, foo, arrow, foo, closeCurly}
//	assertParse(t, toks, block(stmt(fooExpr, fooExpr)), []lexer.Token{})
//
//	toks = []lexer.Token{openCurly, foo, arrow, foo, closeCurly, foo}
//	assertParse(t, toks, block(stmt(fooExpr, fooExpr)), []lexer.Token{foo})
//
//	toks = []lexer.Token{openCurly, foo, arrow, foo, openParen, foo, arrow, foo, closeParen, closeCurly, foo}
//	assertParse(t, toks, block(stmt(fooExpr, fooExpr), stmt(fooExpr, fooExpr)), []lexer.Token{foo})
//
//	toks = []lexer.Token{openCurly, foo, arrow, foo, openParen, foo, arrow, foo, closeParen, closeCurly, foo}
//	assertParse(t, toks, block(stmt(fooExpr, fooExpr), stmt(fooExpr, fooExpr)), []lexer.Token{foo})
//}
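The commented-out tests above exercise sliceEnclosedToks, which slices out the tokens between a matching pair of delimiters while tracking nesting depth. A minimal self-contained sketch of the same idea over runes (a hypothetical helper for illustration, not the project's actual implementation):

```go
package main

import (
	"errors"
	"fmt"
)

// sliceEnclosed returns the runes between the first matching openR/closeR
// pair (handling nested pairs), plus whatever follows the closing rune.
func sliceEnclosed(rs []rune, openR, closeR rune) (inner, rem []rune, err error) {
	if len(rs) == 0 || rs[0] != openR {
		return nil, nil, errors.New("input must start with the opening rune")
	}
	depth := 1
	for i := 1; i < len(rs); i++ {
		switch rs[i] {
		case openR:
			depth++
		case closeR:
			depth--
			if depth == 0 {
				return rs[1:i], rs[i+1:], nil
			}
		}
	}
	return nil, nil, errors.New("unbalanced delimiters")
}

func main() {
	inner, rem, err := sliceEnclosed([]rune("(foo(bar))baz"), '(', ')')
	fmt.Println(string(inner), string(rem), err) // foo(bar) baz <nil>
}
```

As in the tests above, the nested pair stays inside the returned slice; only the outermost delimiters are consumed.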
40  expr/util.go
@ -1,40 +0,0 @@
package expr

import (
	"encoding/hex"
	"fmt"
	"math/rand"
	"strings"
)

func randStr() string {
	b := make([]byte, 16)
	if _, err := rand.Read(b); err != nil {
		panic(err)
	}
	return hex.EncodeToString(b)
}

func exprsJoin(ee []Expr) string {
	strs := make([]string, len(ee))
	for i := range ee {
		strs[i] = fmt.Sprint(ee[i])
	}
	return strings.Join(strs, ", ")
}

func exprsEqual(ee1, ee2 []Expr) bool {
	if len(ee1) != len(ee2) {
		return false
	}
	for i := range ee1 {
		if !exprEqual(ee1[i], ee2[i]) {
			return false
		}
	}
	return true
}

func panicf(msg string, args ...interface{}) {
	panic(fmt.Sprintf(msg, args...))
}
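The randStr helper above can be exercised standalone. This sketch swaps math/rand for crypto/rand (an assumption on my part; crypto/rand.Read never needs seeding and math/rand.Read is deprecated as of Go 1.20), but is otherwise the same 16-bytes-to-hex shape:

```go
package main

import (
	"crypto/rand"
	"encoding/hex"
	"fmt"
)

// randStr mirrors expr/util.go's helper, but sources its bytes from
// crypto/rand instead of math/rand.
func randStr() string {
	b := make([]byte, 16)
	if _, err := rand.Read(b); err != nil {
		panic(err)
	}
	return hex.EncodeToString(b)
}

func main() {
	// 16 random bytes hex-encode to a 32-character string.
	fmt.Println(len(randStr())) // 32
}
```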
26  flake.lock  Normal file
@ -0,0 +1,26 @@
{
  "nodes": {
    "nixpkgs": {
      "locked": {
        "lastModified": 1696983906,
        "narHash": "sha256-L7GyeErguS7Pg4h8nK0wGlcUTbfUMDu+HMf1UcyP72k=",
        "owner": "NixOS",
        "repo": "nixpkgs",
        "rev": "bd1cde45c77891214131cbbea5b1203e485a9d51",
        "type": "github"
      },
      "original": {
        "id": "nixpkgs",
        "ref": "nixos-23.05",
        "type": "indirect"
      }
    },
    "root": {
      "inputs": {
        "nixpkgs": "nixpkgs"
      }
    }
  },
  "root": "root",
  "version": 7
}
44  flake.nix  Normal file
@ -0,0 +1,44 @@
{
  description = "gotc development environment";

  # Nixpkgs / NixOS version to use.
  inputs.nixpkgs.url = "nixpkgs/nixos-23.05";

  outputs = { self, nixpkgs }:
    let

      # to work with older version of flakes
      lastModifiedDate = self.lastModifiedDate or self.lastModified or "19700101";

      # Generate a user-friendly version number.
      version = builtins.substring 0 8 lastModifiedDate;

      # System types to support.
      supportedSystems = [ "x86_64-linux" "x86_64-darwin" "aarch64-linux" "aarch64-darwin" ];

      # Helper function to generate an attrset '{ x86_64-linux = f "x86_64-linux"; ... }'.
      forAllSystems = nixpkgs.lib.genAttrs supportedSystems;

      # Nixpkgs instantiated for supported system types.
      nixpkgsFor = forAllSystems (system: import nixpkgs {
        inherit system;
      });

    in
    {

      # Add dependencies that are only needed for development
      devShells = forAllSystems (system:
        let
          pkgs = nixpkgsFor.${system};
        in {
          default = pkgs.mkShell {
            buildInputs = [
              pkgs.go
              pkgs.gotools
              pkgs.golangci-lint
            ];
          };
        });
    };
}
297  gg/decoder.go  Normal file
@ -0,0 +1,297 @@
package gg

import (
	"fmt"
	"io"
	"strconv"
	"unicode"

	. "code.betamike.com/mediocregopher/ginger/gg/grammar"
	"code.betamike.com/mediocregopher/ginger/graph"
	"golang.org/x/exp/slices"
)

var (
	notNewline = RuneFunc(
		"not-newline", func(r rune) bool { return r != '\n' },
	)

	comment = Prefixed(
		Prefixed(Rune('*'), ZeroOrMore(notNewline)), Rune('\n'),
	)

	whitespace = ZeroOrMore(FirstOf(
		Discard(RuneFunc("whitespace", unicode.IsSpace)),
		Discard(comment),
	))
)

func trimmed[T any](sym Symbol[T]) Symbol[T] {
	sym = PrefixDiscarded(whitespace, sym)
	sym = Suffixed(sym, whitespace)
	return sym
}

func trimmedRune(r rune) Symbol[Located[rune]] {
	return trimmed(Rune(r))
}

var (
	digit = RuneFunc(
		"digit", func(r rune) bool { return '0' <= r && r <= '9' },
	)

	positiveNumber = StringFromRunes(OneOrMore(digit))

	negativeNumber = Reduction(
		Rune('-'),
		positiveNumber,
		func(neg Located[rune], posNum Located[string]) Located[string] {
			return Locate(neg.Location, string(neg.Value)+posNum.Value)
		},
	)

	number = Named(
		"number",
		Mapping(
			FirstOf(negativeNumber, positiveNumber),
			func(str Located[string]) Located[Value] {
				i, err := strconv.ParseInt(str.Value, 10, 64)
				if err != nil {
					panic(fmt.Errorf("parsing %q as int: %w", str, err))
				}

				return Locate(str.Location, Number(i))
			},
		),
	)
)

var (
	nameHead = FirstOf(
		RuneFunc("letter", unicode.IsLetter),
		RuneFunc("mark", unicode.IsMark),
		Rune('!'),
	)

	nameTail = ZeroOrMore(FirstOf(nameHead, digit))

	name = Named(
		"name",
		Reduction(
			nameHead,
			nameTail,
			func(head Located[rune], tail []Located[rune]) Located[Value] {
				name := make([]rune, 0, len(tail)+1)
				name = append(name, head.Value)
				for _, r := range tail {
					name = append(name, r.Value)
				}
				return Locate(head.Location, Name(string(name)))
			},
		),
	)
)

func openEdgeIntoValue(val Value, oe *OpenEdge) *OpenEdge {
	switch {
	case oe == nil:
		return graph.ValueOut(None, val)
	case !oe.EdgeValue().Valid:
		return oe.WithEdgeValue(Some(val))
	default:
		return graph.TupleOut(Some(val), oe)
	}
}

var graphSym, value = func() (
	Symbol[Located[Value]], Symbol[Located[Value]],
) {

	type tupleState struct {
		ins []*OpenEdge
		oe  *OpenEdge
	}

	type graphState struct {
		g  *Graph
		oe *OpenEdge
	}

	var (
		tupleEnd = Mapping(
			trimmedRune(')'),
			func(Located[rune]) tupleState {
				// if ')', then map that to an empty state. This acts as a
				// sentinel value to indicate "end of tuple".
				return tupleState{}
			},
		)

		graphEnd = Mapping(
			trimmedRune('}'),
			func(Located[rune]) graphState {
				// if '}', then map that to an empty state. This acts as a
				// sentinel value to indicate "end of graph".
				return graphState{}
			},
		)
	)

	var (
		// pre-define these, and then fill in the pointers after, in order to
		// deal with recursive dependencies between them.
		value = new(SymbolPtr[Located[Value]])

		tuple                  = new(SymbolPtr[*OpenEdge])
		tupleTail              = new(SymbolPtr[tupleState])
		tupleOpenEdge          = new(SymbolPtr[tupleState])
		tupleOpenEdgeTail      = new(SymbolPtr[tupleState])
		tupleOpenEdgeValueTail = new(SymbolPtr[tupleState])

		graphSym               = new(SymbolPtr[Located[Value]])
		graphTail              = new(SymbolPtr[graphState])
		graphOpenEdge          = new(SymbolPtr[graphState])
		graphOpenEdgeTail      = new(SymbolPtr[graphState])
		graphOpenEdgeValueTail = new(SymbolPtr[graphState])
	)

	tuple.Symbol = Named(
		"tuple",
		Reduction[Located[rune], tupleState, *OpenEdge](
			trimmedRune('('),
			tupleTail,
			func(_ Located[rune], ts tupleState) *OpenEdge {
				slices.Reverse(ts.ins)
				return graph.TupleOut(None, ts.ins...)
			},
		),
	)

	tupleTail.Symbol = FirstOf(
		tupleEnd,
		Mapping[tupleState, tupleState](
			tupleOpenEdge,
			func(ts tupleState) tupleState {
				ts.ins = append(ts.ins, ts.oe)
				ts.oe = nil
				return ts
			},
		),
	)

	tupleOpenEdge.Symbol = FirstOf(
		Reduction[Located[Value], tupleState, tupleState](
			value,
			tupleOpenEdgeValueTail,
			func(val Located[Value], ts tupleState) tupleState {
				ts.oe = openEdgeIntoValue(val.Value, ts.oe)
				return ts
			},
		),
		Reduction[*OpenEdge, tupleState, tupleState](
			tuple,
			tupleOpenEdgeTail,
			func(oe *OpenEdge, ts tupleState) tupleState {
				ts.oe = oe
				return ts
			},
		),
	)

	tupleOpenEdgeTail.Symbol = FirstOf(
		tupleEnd,
		Prefixed[Located[rune], tupleState](trimmedRune(','), tupleTail),
	)

	tupleOpenEdgeValueTail.Symbol = FirstOf[tupleState](
		tupleOpenEdgeTail,
		Prefixed[Located[rune], tupleState](trimmedRune('<'), tupleOpenEdge),
	)

	graphSym.Symbol = Named(
		"graph",
		Reduction[Located[rune], graphState, Located[Value]](
			trimmedRune('{'),
			graphTail,
			func(r Located[rune], gs graphState) Located[Value] {
				if gs.g == nil {
					gs.g = new(Graph)
				}

				return Locate(r.Location, Value{Graph: gs.g})
			},
		),
	)

	graphTail.Symbol = FirstOf(
		graphEnd,
		Reduction(
			name,
			Prefixed[Located[rune], graphState](
				trimmedRune('='), graphOpenEdge,
			),
			func(name Located[Value], gs graphState) graphState {
				if gs.g == nil {
					gs.g = new(Graph)
				}

				gs.g = gs.g.AddValueIn(name.Value, gs.oe)
				gs.oe = nil
				return gs
			},
		),
	)

	graphOpenEdge.Symbol = FirstOf(
		Reduction[Located[Value], graphState, graphState](
			value,
			graphOpenEdgeValueTail,
			func(val Located[Value], gs graphState) graphState {
				gs.oe = openEdgeIntoValue(val.Value, gs.oe)
				return gs
			},
		),
		Reduction[*OpenEdge, graphState, graphState](
			tuple,
			graphOpenEdgeTail,
			func(oe *OpenEdge, gs graphState) graphState {
				gs.oe = oe
				return gs
			},
		),
	)

	graphOpenEdgeTail.Symbol = FirstOf(
		graphEnd,
		Prefixed[Located[rune], graphState](trimmedRune(';'), graphTail),
	)

	graphOpenEdgeValueTail.Symbol = FirstOf[graphState](
		graphOpenEdgeTail,
		Prefixed[Located[rune], graphState](trimmedRune('<'), graphOpenEdge),
	)

	value.Symbol = trimmed(FirstOf[Located[Value]](name, number, graphSym))

	return graphSym, value
}()

// Decoder reads Values off of an io.Reader, or returns io.EOF once the input
// is exhausted.
type Decoder interface {
	Next() (Located[Value], error)
}

type decoder struct {
	r Reader
}

// NewDecoder returns a Decoder which reads off the given io.Reader. The
// io.Reader should not be read from after this call.
func NewDecoder(r io.Reader) Decoder {
	return &decoder{r: NewReader(r)}
}

func (d *decoder) Next() (Located[Value], error) {
	return value.Decode(d.r)
}
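The mutually recursive symbols in decoder.go are tied together through SymbolPtr indirection: each symbol is first allocated as an empty pointer, then filled in once the symbols it refers to exist. A self-contained sketch of that pattern, using hypothetical minimal parser/parserPtr types rather than the actual grammar package, matching parens instead of graphs:

```go
package main

import "fmt"

// parser consumes a prefix of its input, returning the rest and ok.
type parser interface {
	parse(s string) (rest string, ok bool)
}

// parserPtr lets mutually recursive parsers be declared before they are
// defined, mirroring the role of SymbolPtr in gg/decoder.go.
type parserPtr struct{ p parser }

func (pp *parserPtr) parse(s string) (string, bool) { return pp.p.parse(s) }

// parseFunc adapts a plain function into a parser.
type parseFunc func(string) (string, bool)

func (f parseFunc) parse(s string) (string, bool) { return f(s) }

// nestedParens builds the grammar: expr := "()" | "(" expr ")".
// The pointer is allocated first so the closure can refer to itself.
func nestedParens() parser {
	expr := new(parserPtr)
	expr.p = parseFunc(func(s string) (string, bool) {
		if len(s) < 2 || s[0] != '(' {
			return s, false
		}
		if s[1] == ')' {
			return s[2:], true
		}
		rest, ok := expr.parse(s[1:]) // recursive use via the pointer
		if !ok || len(rest) == 0 || rest[0] != ')' {
			return s, false
		}
		return rest[1:], true
	})
	return expr
}

func main() {
	p := nestedParens()
	for _, in := range []string{"()", "((()))", "(()"} {
		rest, ok := p.parse(in)
		fmt.Println(in, ok && rest == "")
	}
	// Output:
	// () true
	// ((())) true
	// (() false
}
```

The same trick is what lets value, tuple, and graphSym above all reference each other before any of them is fully defined.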
365  gg/decoder_test.go  Normal file
@ -0,0 +1,365 @@
package gg

import (
	"bytes"
	"strconv"
	"testing"

	. "code.betamike.com/mediocregopher/ginger/gg/grammar"
	"code.betamike.com/mediocregopher/ginger/graph"
	"github.com/stretchr/testify/assert"
)

func TestDecoder(t *testing.T) {
	type test struct {
		in     string
		exp    Located[Value]
		expErr string
	}

	runTests := func(
		t *testing.T, name string, sym Symbol[Located[Value]], tests []test,
	) {
		t.Run(name, func(t *testing.T) {
			for i, test := range tests {
				t.Run(strconv.Itoa(i), func(t *testing.T) {
					r := NewReader(bytes.NewBufferString(test.in))
					got, err := sym.Decode(r)
					if test.expErr != "" {
						assert.Error(t, err)
						assert.Equal(t, test.expErr, err.Error())
					} else if assert.NoError(t, err) {
						assert.True(t,
							test.exp.Value.Equal(got.Value),
							"\nexp:%v\ngot:%v", test.exp, got,
						)
						assert.Equal(t, test.exp.Location, got.Location)
					}
				})
			}
		})
	}

	expNum := func(row, col int, n int64) Located[Value] {
		return Locate(Location{Row: row, Col: col}, Number(n))
	}

	runTests(t, "number", number, []test{
		{in: `0`, exp: expNum(1, 1, 0)},
		{in: `100`, exp: expNum(1, 1, 100)},
		{in: `-100`, exp: expNum(1, 1, -100)},
		{in: `0foo`, exp: expNum(1, 1, 0)},
		{in: `100foo`, exp: expNum(1, 1, 100)},
	})

	expName := func(row, col int, name string) Located[Value] {
		return Locate(Location{Row: row, Col: col}, Name(name))
	}

	expGraph := func(row, col int, g *Graph) Located[Value] {
		return Locate(Location{Row: row, Col: col}, Value{Graph: g})
	}

	runTests(t, "name", name, []test{
		{in: `a`, exp: expName(1, 1, "a")},
		{in: `ab`, exp: expName(1, 1, "ab")},
		{in: `ab2c`, exp: expName(1, 1, "ab2c")},
		{in: `ab2c,`, exp: expName(1, 1, "ab2c")},
		{in: `!ab2c,`, exp: expName(1, 1, "!ab2c")},
	})

	runTests(t, "graph", graphSym, []test{
		{in: `{}`, exp: expGraph(1, 1, new(Graph))},
		{in: `{`, expErr: `1:2: expected '}' or name`},
		{in: `{a}`, expErr: `1:3: expected '='`},
		{in: `{a=}`, expErr: `1:4: expected name or number or graph or tuple`},
		{
			in: `{foo=a}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(Name("foo"), graph.ValueOut(None, Name("a"))),
			),
		},
		{
			in: `{ foo = a }`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(Name("foo"), graph.ValueOut(None, Name("a"))),
			),
		},
		{in: `{1=a}`, expErr: `1:2: expected '}' or name`},
		{in: `{foo=a ,}`, expErr: `1:8: expected '}' or ';' or '<'`},
		{in: `{foo=a`, expErr: `1:7: expected '}' or ';' or '<'`},
		{
			in: `{foo=a<b}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(Some(Name("a")), Name("b")),
					),
			),
		},
		{
			in: `{foo=a< b <c}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.ValueOut(
								Some(Name("b")),
								Name("c"),
							),
						),
					),
			),
		},
		{
			in: `{foo =a<b<c<1 }`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.TupleOut(
								Some(Name("b")),
								graph.ValueOut(
									Some(Name("c")),
									Number(1),
								),
							),
						),
					),
			),
		},
		{
			in: `{foo=a<b ; }`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(
							Some(Name("a")),
							Name("b"),
						),
					),
			),
		},
		{
			in: `{foo=a<b;bar=c}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(
							Some(Name("a")),
							Name("b"),
						),
					).
					AddValueIn(
						Name("bar"),
						graph.ValueOut(None, Name("c")),
					),
			),
		},
		{
			in: `{foo= a<{ baz=1 } ; bar=c}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(
							Some(Name("a")),
							Value{Graph: new(Graph).AddValueIn(
								Name("baz"),
								graph.ValueOut(None, Number(1)),
							)},
						),
					).
					AddValueIn(
						Name("bar"),
						graph.ValueOut(None, Name("c")),
					),
			),
		},
		{
			in: `{foo= {baz=1} <a; bar=c}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(
							Some(Value{Graph: new(Graph).AddValueIn(
								Name("baz"),
								graph.ValueOut(None, Number(1)),
							)}),
							Name("a"),
						),
					).
					AddValueIn(
						Name("bar"),
						graph.ValueOut(None, Name("c")),
					),
			),
		},
	})

	runTests(t, "tuple", graphSym, []test{
		{
			in: `{foo=(a)}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(None, Name("a")),
					),
			),
		},
		{
			in: `{foo=(a<b)}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(
							Some(Name("a")),
							Name("b"),
						),
					),
			),
		},
		{
			in: `{foo=a<(b)}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.ValueOut(
							Some(Name("a")),
							Name("b"),
						),
					),
			),
		},
		{
			in: `{foo=a<(b,c)}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.ValueOut(None, Name("b")),
							graph.ValueOut(None, Name("c")),
						),
					),
			),
		},
		{
			in: `{foo=a<(b<c)}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.TupleOut(
								Some(Name("b")),
								graph.ValueOut(None, Name("c")),
							),
						),
					),
			),
		},
		{
			in: `{foo=a<(b<(c))}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.TupleOut(
								Some(Name("b")),
								graph.ValueOut(None, Name("c")),
							),
						),
					),
			),
		},
		{
			in: `{foo=a<(b<(c,d<1))}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.TupleOut(
								Some(Name("b")),
								graph.ValueOut(None, Name("c")),
								graph.ValueOut(
									Some(Name("d")),
									Number(1),
								),
							),
						),
					),
			),
		},
		{
			in: `{foo=a<(b<( ( (c) ) ))}`,
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
						Name("foo"),
						graph.TupleOut(
							Some(Name("a")),
							graph.TupleOut(
								Some(Name("b")),
								graph.ValueOut(None, Name("c")),
							),
						),
					),
			),
		},
	})

	runTests(t, "comment", graphSym, []test{
		{
			in:  "*\n{}",
			exp: expGraph(2, 1, new(Graph)),
		},
		{
			in:  "* ignore me!\n{}",
			exp: expGraph(2, 1, new(Graph)),
		},
		{
			in:  "{* ignore me!\n}",
			exp: expGraph(1, 1, new(Graph)),
		},
		{
			in: "{foo* ignore me!\n = a}",
			exp: expGraph(
				1, 1, new(Graph).
					AddValueIn(
|
||||||
|
Name("foo"),
|
||||||
|
graph.ValueOut(None, Name("a")),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
in: "{foo = a* ignore me!\n}",
|
||||||
|
exp: expGraph(
|
||||||
|
1, 1, new(Graph).
|
||||||
|
AddValueIn(
|
||||||
|
Name("foo"),
|
||||||
|
graph.ValueOut(None, Name("a")),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
},
|
||||||
|
})
|
||||||
|
}
|
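The cases above all follow Go's table-driven test pattern: each entry pairs an input string with the graph the decoder is expected to produce. A minimal standalone sketch of that pattern, independent of this repo (the `test` struct and `runTests` helper here are illustrative stand-ins, not the repo's own):

```go
package main

import "fmt"

// test pairs an input with an expected result, mirroring the
// table-driven shape used in gg_test.go.
type test struct {
	in  string
	exp int
}

// runTests applies fn to each case's input and reports how many
// cases did not match their expected value.
func runTests(name string, fn func(string) int, tests []test) int {
	fails := 0
	for _, tc := range tests {
		if got := fn(tc.in); got != tc.exp {
			fmt.Printf("%s: in=%q got=%d want=%d\n", name, tc.in, got, tc.exp)
			fails++
		}
	}
	return fails
}

func main() {
	// Trivial function under test: string length.
	fails := runTests("len", func(s string) int { return len(s) }, []test{
		{in: "{}", exp: 2},
		{in: "{foo=a}", exp: 7},
	})
	fmt.Println("failures:", fails)
}
```

The pattern keeps each case to one literal, so adding coverage for a new syntax form is a one-entry change.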
25	gg/gg.bnf	Normal file
@@ -0,0 +1,25 @@
<digit> ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
<positive-number> ::= <digit>+
<negative-number> ::= "-" <positive-number>
<number> ::= <negative-number> | <positive-number>

<name-head> ::= <letter> | <mark> | "!"
<name-tail> ::= <name-head> | <digit>
<name> ::= <name-head> <name-tail>*

<tuple> ::= "(" <tuple-tail>
<tuple-tail> ::= ")" | <tuple-open-edge>
<tuple-open-edge> ::= <value> <tuple-open-edge-value-tail>
                    | <tuple> <tuple-open-edge-tail>
<tuple-open-edge-tail> ::= ")" | "," <tuple-tail>
<tuple-open-edge-value-tail> ::= <tuple-open-edge-tail> | "<" <tuple-open-edge>

<graph> ::= "{" <graph-tail>
<graph-tail> ::= "}" | <name> "=" <graph-open-edge>
<graph-open-edge> ::= <value> <graph-open-edge-value-tail>
                    | <tuple> <graph-open-edge-tail>
<graph-open-edge-tail> ::= "}" | ";" <graph-tail>
<graph-open-edge-value-tail> ::= <graph-open-edge-tail> | "<" <graph-open-edge>

<value> ::= <name> | <number> | <graph>
<gg> ::= <eof> | <value> <gg>
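As an illustration of reading this grammar, here is a minimal standalone recognizer for just the `<number>` rule (an optional leading `-` followed by one or more digits). This is a sketch for exposition only, not part of the repo, and `matchNumber` is a made-up name:

```go
package main

import "fmt"

// matchNumber reports whether s matches <number>:
//   <negative-number> ::= "-" <positive-number>
//   <positive-number> ::= <digit>+
func matchNumber(s string) bool {
	if len(s) > 0 && s[0] == '-' {
		s = s[1:] // consume the "-" of <negative-number>
	}
	if len(s) == 0 {
		return false // <positive-number> requires at least one digit
	}
	for i := 0; i < len(s); i++ {
		if s[i] < '0' || s[i] > '9' {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(matchNumber("-42"), matchNumber(""), matchNumber("4x2"))
}
```

Each alternation in the grammar becomes a branch, and each `+`/`*` repetition becomes a loop; the larger `<tuple>`/`<graph>` rules are written so a parser can likewise decide each branch from a single lookahead token.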
592	gg/gg.go
@@ -1,554 +1,114 @@
-// Package gg implements ginger graph creation, traversal, and (de)serialization
+// Package gg implements graph serialization to/from the gg text format.
 package gg

 import (
-	"crypto/rand"
-	"encoding/hex"
 	"fmt"
-	"strings"
+
+	"code.betamike.com/mediocregopher/ginger/graph"
 )

-// Value wraps a go value in a way such that it will be uniquely identified
-// within any Graph and between Graphs. Use NewValue to create a Value instance.
-// You can create an instance manually as long as ID is globally unique.
-type Value struct {
-	ID string
-	V  interface{}
-}
-
-// NewValue returns a Value instance wrapping any go value. The Value returned
-// will be independent of the passed in go value. So if the same go value is
-// passed in twice then the two returned Value instances will be treated as
-// being different values by Graph.
-func NewValue(V interface{}) Value {
-	b := make([]byte, 8)
-	if _, err := rand.Read(b); err != nil {
-		panic(err)
-	}
-	return Value{
-		ID: hex.EncodeToString(b),
-		V:  V,
-	}
-}
-
-// VertexType enumerates the different possible vertex types
-type VertexType string
-
-const (
-	// ValueVertex is a Vertex which contains exactly one value and has at least
-	// one edge (either input or output)
-	ValueVertex VertexType = "value"
-
-	// JunctionVertex is a Vertex which contains two or more in edges and
-	// exactly one out edge
-	JunctionVertex VertexType = "junction"
-)
-
-// Edge is a uni-directional connection between two vertices with an attribute
-// value
-type Edge struct {
-	From  *Vertex
-	Value Value
-	To    *Vertex
-}
-
-// Vertex is a vertex in a Graph. No fields should be modified directly, only
-// through method calls
-type Vertex struct {
-	ID string
-	VertexType
-	Value   Value // Value is valid if-and-only-if VertexType is ValueVertex
-	In, Out []Edge
-}
-
-////////////////////////////////////////////////////////////////////////////////
-
-// OpenEdge is an un-realized Edge which can't be used for anything except
-// constructing graphs. It has no meaning on its own.
-type OpenEdge struct {
-	// fromV will be the source vertex as-if the vertex (and any sub-vertices of
-	// it) doesn't already exist in the graph. If it or it's sub-vertices does
-	// already that will need to be taken into account when persisting into the
-	// graph
-	fromV vertex
-	val   Value
-}
-
-func (oe OpenEdge) id() string {
-	return fmt.Sprintf("(%s,%s)", oe.fromV.id, oe.val.ID)
-}
-
-// vertex is a representation of a vertex in the graph. Each Graph contains a
-// set of all the Value vertex instances it knows about. Each of these contains
-// all the input OpenEdges which are known for it. So you can think of these
-// "top-level" Value vertex instances as root nodes in a tree, and each OpenEdge
-// as a branch.
-//
-// If a OpenEdge contains a fromV which is a Value that vertex won't have its in
-// slice populated no matter what. If fromV is a Junction it will be populated,
-// with any sub-Value's not being populated and so-on recursively
-//
-// When a view is constructed in makeView these Value instances are deduplicated
-// and the top-level one's in value is used to properly connect it.
-type vertex struct {
-	id string
-	VertexType
-	val Value
-	in  []OpenEdge
-}
-
-func (v vertex) cp() vertex {
-	cp := v
-	cp.in = make([]OpenEdge, len(v.in))
-	copy(cp.in, v.in)
-	return cp
-}
-
-func (v vertex) hasOpenEdge(oe OpenEdge) bool {
-	oeID := oe.id()
-	for _, in := range v.in {
-		if in.id() == oeID {
-			return true
-		}
-	}
-	return false
-}
-
-func (v vertex) cpAndDelOpenEdge(oe OpenEdge) (vertex, bool) {
-	oeID := oe.id()
-	for i, in := range v.in {
-		if in.id() == oeID {
-			v = v.cp()
-			v.in = append(v.in[:i], v.in[i+1:]...)
-			return v, true
-		}
-	}
-	return v, false
-}
-
-// Graph is a wrapper around a set of connected Vertices
-type Graph struct {
-	vM map[string]vertex // only contains value vertices
-
-	// generated by makeView on-demand
-	byVal map[string]*Vertex
-	all   map[string]*Vertex
-}
-
-// Null is the root empty graph, and is the base off which all graphs are built
-var Null = &Graph{
-	vM:    map[string]vertex{},
-	byVal: map[string]*Vertex{},
-	all:   map[string]*Vertex{},
-}
-
-// this does _not_ copy the view, as it's assumed the only reason to copy a
-// graph is to modify it anyway
-func (g *Graph) cp() *Graph {
-	cp := &Graph{
-		vM: make(map[string]vertex, len(g.vM)),
-	}
-	for vID, v := range g.vM {
-		cp.vM[vID] = v
-	}
-	return cp
-}
-
-////////////////////////////////////////////////////////////////////////////////
-// Graph creation
-
-func mkVertex(typ VertexType, val Value, ins ...OpenEdge) vertex {
-	v := vertex{VertexType: typ, in: ins}
-	switch typ {
-	case ValueVertex:
-		v.id = val.ID
-		v.val = val
-	case JunctionVertex:
-		inIDs := make([]string, len(ins))
-		for i := range ins {
-			inIDs[i] = ins[i].id()
-		}
-		v.id = "[" + strings.Join(inIDs, ",") + "]"
-	default:
-		panic(fmt.Sprintf("unknown vertex type %q", typ))
-	}
-	return v
-}
-
-// ValueOut creates a OpenEdge which, when used to construct a Graph, represents
-// an edge (with edgeVal attached to it) coming from the ValueVertex containing
-// val.
-//
-// When constructing Graphs, Value vertices are de-duplicated on their Value. So
-// multiple ValueOut OpenEdges constructed with the same val will be leaving the
-// same Vertex instance in the constructed Graph.
-func ValueOut(val, edgeVal Value) OpenEdge {
-	return OpenEdge{fromV: mkVertex(ValueVertex, val), val: edgeVal}
-}
-
-// JunctionOut creates a OpenEdge which, when used to construct a Graph,
-// represents an edge (with edgeVal attached to it) coming from the
-// JunctionVertex comprised of the given ordered-set of input edges.
-//
-// When constructing Graphs Junction vertices are de-duplicated on their input
-// edges. So multiple Junction OpenEdges constructed with the same set of input
-// edges will be leaving the same Junction instance in the constructed Graph.
-func JunctionOut(ins []OpenEdge, edgeVal Value) OpenEdge {
-	return OpenEdge{
-		fromV: mkVertex(JunctionVertex, Value{}, ins...),
-		val:   edgeVal,
-	}
-}
-
-// AddValueIn takes a OpenEdge and connects it to the Value Vertex containing
-// val, returning the new Graph which reflects that connection. Any Vertices
-// referenced within toe OpenEdge which do not yet exist in the Graph will also
-// be created in this step.
-func (g *Graph) AddValueIn(oe OpenEdge, val Value) *Graph {
-	to := mkVertex(ValueVertex, val)
-	toID := to.id
-
-	// if to is already in the graph, pull it out, as it might have existing in
-	// edges we want to keep
-	if exTo, ok := g.vM[toID]; ok {
-		to = exTo
-	}
-
-	// if the incoming edge already exists in to then there's nothing to do
-	if to.hasOpenEdge(oe) {
-		return g
-	}
-
-	to = to.cp()
-	to.in = append(to.in, oe)
-	g = g.cp()
-
-	// starting with to (which we always overwrite) go through vM and
-	// recursively add in any vertices which aren't already there
-	var persist func(vertex)
-	persist = func(v vertex) {
-		if v.VertexType == ValueVertex {
-			vID := v.id
-			if _, ok := g.vM[vID]; !ok {
-				g.vM[vID] = v
-			}
-		} else {
-			for _, e := range v.in {
-				persist(e.fromV)
-			}
-		}
-	}
-	delete(g.vM, toID)
-	persist(to)
-	for _, e := range to.in {
-		persist(e.fromV)
-	}
-
-	return g
-}
-
-// DelValueIn takes a OpenEdge and disconnects it from the Value Vertex
-// containing val, returning the new Graph which reflects the disconnection. If
-// the Value Vertex doesn't exist within the graph, or it doesn't have the given
-// OpenEdge, no changes are made. Any vertices referenced by toe OpenEdge for
-// which that edge is their only outgoing edge will be removed from the Graph.
-func (g *Graph) DelValueIn(oe OpenEdge, val Value) *Graph {
-	to := mkVertex(ValueVertex, val)
-	toID := to.id
-
-	// pull to out of the graph. if it's not there then bail
-	var ok bool
-	if to, ok = g.vM[toID]; !ok {
-		return g
-	}
-
-	// get new copy of to without the half-edge, or return if the half-edge
-	// wasn't even in to
-	to, ok = to.cpAndDelOpenEdge(oe)
-	if !ok {
-		return g
-	}
-	g = g.cp()
-	g.vM[toID] = to
-
-	// connectedTo returns whether the vertex has any connections with the
-	// vertex of the given id, descending recursively
-	var connectedTo func(string, vertex) bool
-	connectedTo = func(vID string, curr vertex) bool {
-		for _, in := range curr.in {
-			if in.fromV.VertexType == ValueVertex && in.fromV.id == vID {
-				return true
-			} else if in.fromV.VertexType == JunctionVertex && connectedTo(vID, in.fromV) {
-				return true
-			}
-		}
-		return false
-	}
-
-	// isOrphaned returns whether the given vertex has any connections to other
-	// nodes in the graph
-	isOrphaned := func(v vertex) bool {
-		vID := v.id
-		if v, ok := g.vM[vID]; ok && len(v.in) > 0 {
-			return false
-		}
-		for vID2, v2 := range g.vM {
-			if vID2 == vID {
-				continue
-			} else if connectedTo(vID, v2) {
-				return false
-			}
-		}
-		return true
-	}
-
-	// if to is orphaned get rid of it
-	if isOrphaned(to) {
-		delete(g.vM, toID)
-	}
-
-	// rmOrphaned descends down the given OpenEdge and removes any Value
-	// Vertices referenced in it which are now orphaned
-	var rmOrphaned func(OpenEdge)
-	rmOrphaned = func(oe OpenEdge) {
-		if oe.fromV.VertexType == ValueVertex && isOrphaned(oe.fromV) {
-			delete(g.vM, oe.fromV.id)
-		} else if oe.fromV.VertexType == JunctionVertex {
-			for _, juncOe := range oe.fromV.in {
-				rmOrphaned(juncOe)
-			}
-		}
-	}
-	rmOrphaned(oe)
-
-	return g
-}
-
-// Union takes in another Graph and returns a new one which is the union of the
-// two. Value vertices which are shared between the two will be merged so that
-// the new vertex has the input edges of both.
-//
-// TODO it bothers me that the opposite of Disjoin is Union and not "Join"
-func (g *Graph) Union(g2 *Graph) *Graph {
-	g = g.cp()
-	for vID, v2 := range g2.vM {
-		v, ok := g.vM[vID]
-		if !ok {
-			v = v2
-		} else {
-			for _, v2e := range v2.in {
-				if !v.hasOpenEdge(v2e) {
-					v.in = append(v.in, v2e)
-				}
-			}
-		}
-		g.vM[vID] = v
-	}
-	return g
-}
-
-// Disjoin splits the Graph into as many independently connected Graphs as it
-// contains. Each Graph returned will have vertices connected only within itself
-// and not across to the other Graphs, and the Union of all returned Graphs will
-// be the original again.
-//
-// The order of the Graphs returned is not deterministic.
-//
-// Null.Disjoin() returns empty slice.
-func (g *Graph) Disjoin() []*Graph {
-	m := map[string]*Graph{}    // maps each id to the Graph it belongs to
-	mG := map[*Graph]struct{}{} // tracks unique Graphs created
-
-	var connectedTo func(vertex) *Graph
-	connectedTo = func(v vertex) *Graph {
-		if v.VertexType == ValueVertex {
-			if g := m[v.id]; g != nil {
-				return g
-			}
-		}
-		for _, oe := range v.in {
-			if g := connectedTo(oe.fromV); g != nil {
-				return g
-			}
-		}
-		return nil
-	}
-
-	// used upon finding out that previously-thought-to-be disconnected vertices
-	// aren't. Merges the two graphs they're connected into together into one
-	// and updates all state internal to this function accordingly.
-	rejoin := func(gDst, gSrc *Graph) {
-		for id, v := range gSrc.vM {
-			gDst.vM[id] = v
-			m[id] = gDst
-		}
-		delete(mG, gSrc)
-	}
-
-	var connectTo func(vertex, *Graph)
-	connectTo = func(v vertex, g *Graph) {
-		if v.VertexType == ValueVertex {
-			if g2, ok := m[v.id]; ok && g != g2 {
-				rejoin(g, g2)
-			}
-			m[v.id] = g
-		}
-		for _, oe := range v.in {
-			connectTo(oe.fromV, g)
-		}
-	}
-
-	for id, v := range g.vM {
-		gV := connectedTo(v)
-
-		// if gV is nil it means this vertex is part of a new Graph which
-		// nothing else has been connected to yet.
-		if gV == nil {
-			gV = Null.cp()
-			mG[gV] = struct{}{}
-		}
-		gV.vM[id] = v
-
-		// do this no matter what, because we want to descend in to the in edges
-		// and mark all of those as being part of this graph too
-		connectTo(v, gV)
-	}
-
-	gg := make([]*Graph, 0, len(mG))
-	for g := range mG {
-		gg = append(gg, g)
-	}
-	return gg
-}
-
-////////////////////////////////////////////////////////////////////////////////
-// Graph traversal
-
-func (g *Graph) makeView() {
-	if g.byVal != nil {
-		return
-	}
-
-	g.byVal = make(map[string]*Vertex, len(g.vM))
-	g.all = map[string]*Vertex{}
-
-	var getV func(vertex, bool) *Vertex
-	getV = func(v vertex, top bool) *Vertex {
-		V, ok := g.all[v.id]
-		if !ok {
-			V = &Vertex{ID: v.id, VertexType: v.VertexType, Value: v.val}
-			g.all[v.id] = V
-		}
-
-		// we can be sure all Value vertices will be called with top==true at
-		// some point, so we only need to descend into the input edges if:
-		// * top is true
-		// * this is a junction's first time being gotten
-		if !top && (ok || v.VertexType != JunctionVertex) {
-			return V
-		}
-
-		V.In = make([]Edge, 0, len(v.in))
-		for i := range v.in {
-			fromV := getV(v.in[i].fromV, false)
-			e := Edge{From: fromV, Value: v.in[i].val, To: V}
-			fromV.Out = append(fromV.Out, e)
-			V.In = append(V.In, e)
-		}
-
-		if v.VertexType == ValueVertex {
-			g.byVal[v.val.ID] = V
-		}
-
-		return V
-	}
-
-	for _, v := range g.vM {
-		getV(v, true)
-	}
-}
-
-// ValueVertex returns the Value Vertex for the given value. If the Graph
-// doesn't contain a vertex for the value then nil is returned
-func (g *Graph) ValueVertex(val Value) *Vertex {
-	g.makeView()
-	return g.byVal[val.ID]
-}
-
-// ValueVertices returns all Value Vertices in the Graph
-func (g *Graph) ValueVertices() []*Vertex {
-	g.makeView()
-	vv := make([]*Vertex, 0, len(g.byVal))
-	for _, v := range g.byVal {
-		vv = append(vv, v)
-	}
-	return vv
-}
-
-// Equal returns whether or not the two Graphs are equivalent in value
-func Equal(g1, g2 *Graph) bool {
-	if len(g1.vM) != len(g2.vM) {
-		return false
-	}
-	for v1ID, v1 := range g1.vM {
-		v2, ok := g2.vM[v1ID]
-		if !ok {
-			return false
-		}
-
-		// since the vertices are values we must make sure their input sets are
-		// the same (which is tricky since they're unordered, unlike a
-		// junction's)
-		if len(v1.in) != len(v2.in) {
-			return false
-		}
-		for _, in := range v1.in {
-			if !v2.hasOpenEdge(in) {
-				return false
-			}
-		}
-	}
-	return true
-}
-
-// TODO Walk, but by edge
-// TODO Walk, but without end. AKA FSM
-
-// Iter will iterate through the Graph's vertices, calling the callback on every
-// Vertex in the Graph once. The vertex order used is non-deterministic. If the
-// callback returns false the iteration is stopped.
-func (g *Graph) Iter(callback func(*Vertex) bool) {
-	g.makeView()
-	if len(g.byVal) == 0 {
-		return
-	}
-
-	seen := make(map[*Vertex]bool, len(g.byVal))
-	var innerWalk func(*Vertex) bool
-	innerWalk = func(v *Vertex) bool {
-		if seen[v] {
-			return true
-		} else if !callback(v) {
-			return false
-		}
-		seen[v] = true
-		for _, e := range v.In {
-			if !innerWalk(e.From) {
-				return false
-			}
-		}
-		return true
-	}
-
-	for _, v := range g.byVal {
-		if !innerWalk(v) {
-			return
-		}
-	}
-}
-
-// ByID returns all vertices indexed by their ID field
-func (g *Graph) ByID() map[string]*Vertex {
-	g.makeView()
-	return g.all
-}
+// Type aliases for convenience
+type (
+	Graph    = graph.Graph[OptionalValue, Value]
+	OpenEdge = graph.OpenEdge[OptionalValue, Value]
+)
+
+// Value represents a value which can be serialized by the gg text format.
+type Value struct {
+	// Only one of these fields may be set
+	Name   *string
+	Number *int64
+	Graph  *Graph
+}
+
+// Name returns a name Value.
+func Name(name string) Value {
+	return Value{Name: &name}
+}
+
+// Number returns a number Value.
+func Number(n int64) Value {
+	return Value{Number: &n}
+}
+
+// Equal returns true if the passed in Value is equivalent, ignoring the
+// LexerToken on either Value.
+//
+// Will panic if the passed in v2 is not a Value from this package.
+func (v Value) Equal(v2g graph.Value) bool {
+	v2 := v2g.(Value)
+	switch {
+	case v.Name != nil && v2.Name != nil && *v.Name == *v2.Name:
+		return true
+	case v.Number != nil && v2.Number != nil && *v.Number == *v2.Number:
+		return true
+	case v.Graph != nil && v2.Graph != nil && v.Graph.Equal(v2.Graph):
+		return true
+	default:
+		return false
+	}
+}
+
+func (v Value) String() string {
+	switch {
+	case v.Name != nil:
+		return *v.Name
+	case v.Number != nil:
+		return fmt.Sprint(*v.Number)
+	case v.Graph != nil:
+		return v.Graph.String()
+	default:
+		panic("no fields set on Value")
+	}
+}
+
+// OptionalValue is a Value which may be unset. This is used for edge values,
+// since edges might not have a value.
+type OptionalValue struct {
+	Value
+	Valid bool
+}
+
+// None is the zero OptionalValue (hello rustaceans).
+var None OptionalValue
+
+// Some wraps a Value to be an OptionalValue.
+func Some(v Value) OptionalValue {
+	return OptionalValue{Valid: true, Value: v}
+}
+
+func (v OptionalValue) String() string {
+	if !v.Valid {
+		return "<none>"
+	}
+	return v.Value.String()
+}
+
+func (v OptionalValue) Equal(v2g graph.Value) bool {
+	var v2 OptionalValue
+	if v2Val, ok := v2g.(Value); ok {
+		v2 = Some(v2Val)
+	} else {
+		v2 = v2g.(OptionalValue)
+	}
+
+	if v.Valid != v2.Valid {
+		return false
+	} else if !v.Valid {
+		return true
+	}
+
+	return v.Value.Equal(v2.Value)
+}
665	gg/gg_test.go
@@ -1,665 +0,0 @@
package gg

import (
	"fmt"
	"sort"
	"strings"
	. "testing"

	"github.com/stretchr/testify/assert"
)

func edge(val Value, from *Vertex) Edge {
	return Edge{Value: val, From: from}
}

func value(val Value, in ...Edge) *Vertex {
	return &Vertex{
		VertexType: ValueVertex,
		Value:      val,
		In:         in,
	}
}

func junction(val Value, in ...Edge) Edge {
	return Edge{
		From: &Vertex{
			VertexType: JunctionVertex,
			In:         in,
		},
		Value: val,
	}
}

func assertVertexEqual(t *T, exp, got *Vertex, msgAndArgs ...interface{}) bool {
	var assertInner func(*Vertex, *Vertex, map[*Vertex]bool) bool
	assertInner = func(exp, got *Vertex, m map[*Vertex]bool) bool {
		// if got is already in m then we've already looked at it
		if m[got] {
			return true
		}
		m[got] = true

		assert.Equal(t, exp.VertexType, got.VertexType, msgAndArgs...)
		assert.Equal(t, exp.Value, got.Value, msgAndArgs...)
		if !assert.Len(t, got.In, len(exp.In), msgAndArgs...) {
			return false
		}
		for i := range exp.In {
			assertInner(exp.In[i].From, got.In[i].From, m)
			assert.Equal(t, exp.In[i].Value, got.In[i].Value, msgAndArgs...)
			assert.Equal(t, got, got.In[i].To)
			assert.Contains(t, got.In[i].From.Out, got.In[i])
		}
		return true

	}
	return assertInner(exp, got, map[*Vertex]bool{})
}

func assertIter(t *T, expVals, expJuncs int, g *Graph, msgAndArgs ...interface{}) {
	seen := map[*Vertex]bool{}
	var gotVals, gotJuncs int
	g.Iter(func(v *Vertex) bool {
		assert.NotContains(t, seen, v, msgAndArgs...)
		seen[v] = true
		if v.VertexType == ValueVertex {
			gotVals++
		} else {
			gotJuncs++
		}
		return true
	})
	assert.Equal(t, expVals, gotVals, msgAndArgs...)
	assert.Equal(t, expJuncs, gotJuncs, msgAndArgs...)
}

type graphTest struct {
	name              string
	out               func() *Graph
	exp               []*Vertex
	numVals, numJuncs int
}

func mkTest(name string, out func() *Graph, numVals, numJuncs int, exp ...*Vertex) graphTest {
	return graphTest{
		name: name,
		out:  out,
		exp:  exp,
		numVals: numVals, numJuncs: numJuncs,
	}
}

func TestGraph(t *T) {
	var (
		v0  = NewValue("v0")
		v1  = NewValue("v1")
		v2  = NewValue("v2")
		v3  = NewValue("v3")
		e0  = NewValue("e0")
		e00 = NewValue("e00")
		e01 = NewValue("e01")
		e1  = NewValue("e1")
		e10 = NewValue("e10")
		e11 = NewValue("e11")
		e2  = NewValue("e2")
		e20 = NewValue("e20")
		e21 = NewValue("e21")
		ej0 = NewValue("ej0")
		ej1 = NewValue("ej1")
		ej2 = NewValue("ej2")
	)
	tests := []graphTest{
		mkTest(
			"values-basic",
			func() *Graph {
				return Null.AddValueIn(ValueOut(v0, e0), v1)
			},
			2, 0,
			value(v0),
			value(v1, edge(e0, value(v0))),
		),

		mkTest(
			"values-2edges",
			func() *Graph {
				g0 := Null.AddValueIn(ValueOut(v0, e0), v2)
				return g0.AddValueIn(ValueOut(v1, e1), v2)
			},
			3, 0,
			value(v0),
			value(v1),
			value(v2,
				edge(e0, value(v0)),
				edge(e1, value(v1)),
			),
		),

		mkTest(
			"values-separate",
			func() *Graph {
				g0 := Null.AddValueIn(ValueOut(v0, e0), v1)
				return g0.AddValueIn(ValueOut(v2, e2), v3)
			},
			4, 0,
			value(v0),
			value(v1, edge(e0, value(v0))),
			value(v2),
			value(v3, edge(e2, value(v2))),
		),

		mkTest(
			"values-circular",
			func() *Graph {
				return Null.AddValueIn(ValueOut(v0, e0), v0)
			},
			1, 0,
			value(v0, edge(e0, value(v0))),
		),

		mkTest(
			"values-circular2",
			func() *Graph {
				g0 := Null.AddValueIn(ValueOut(v0, e0), v1)
				return g0.AddValueIn(ValueOut(v1, e1), v0)
			},
			2, 0,
			value(v0, edge(e1, value(v1, edge(e0, value(v0))))),
			value(v1, edge(e0, value(v0, edge(e1, value(v1))))),
		),

		mkTest(
			"values-circular3",
			func() *Graph {
				g0 := Null.AddValueIn(ValueOut(v0, e0), v1)
				g1 := g0.AddValueIn(ValueOut(v1, e1), v2)
				return g1.AddValueIn(ValueOut(v2, e2), v1)
			},
			3, 0,
			value(v0),
			value(v1,
				edge(e0, value(v0)),
				edge(e2, value(v2, edge(e1, value(v1)))),
			),
			value(v2, edge(e1, value(v1,
				edge(e0, value(v0)),
				edge(e2, value(v2)),
			))),
		),

		mkTest(
			"junction-basic",
			func() *Graph {
				e0 := ValueOut(v0, e0)
				e1 := ValueOut(v1, e1)
				ej0 := JunctionOut([]OpenEdge{e0, e1}, ej0)
				return Null.AddValueIn(ej0, v2)
			},
			3, 1,
			value(v0), value(v1),
			value(v2, junction(ej0,
				edge(e0, value(v0)),
				edge(e1, value(v1)),
			)),
		),

		mkTest(
			"junction-basic2",
			func() *Graph {
				e00 := ValueOut(v0, e00)
				e10 := ValueOut(v1, e10)
				ej0 := JunctionOut([]OpenEdge{e00, e10}, ej0)
				e01 := ValueOut(v0, e01)
				e11 := ValueOut(v1, e11)
				ej1 := JunctionOut([]OpenEdge{e01, e11}, ej1)
				ej2 := JunctionOut([]OpenEdge{ej0, ej1}, ej2)
				return Null.AddValueIn(ej2, v2)
			},
			3, 3,
			value(v0), value(v1),
			value(v2, junction(ej2,
				junction(ej0,
					edge(e00, value(v0)),
					edge(e10, value(v1)),
				),
				junction(ej1,
					edge(e01, value(v0)),
					edge(e11, value(v1)),
				),
			)),
		),

		mkTest(
			"junction-circular",
			func() *Graph {
				e0 := ValueOut(v0, e0)
				e1 := ValueOut(v1, e1)
				ej0 := JunctionOut([]OpenEdge{e0, e1}, ej0)
				g0 := Null.AddValueIn(ej0, v2)
				e20 := ValueOut(v2, e20)
				g1 := g0.AddValueIn(e20, v0)
				e21 := ValueOut(v2, e21)
				return g1.AddValueIn(e21, v1)
			},
			3, 1,
			value(v0, edge(e20, value(v2, junction(ej0,
				edge(e0, value(v0)),
				edge(e1, value(v1, edge(e21, value(v2)))),
			)))),
			value(v1, edge(e21, value(v2, junction(ej0,
				edge(e0, value(v0, edge(e20, value(v2)))),
				edge(e1, value(v1)),
			)))),
			value(v2, junction(ej0,
				edge(e0, value(v0, edge(e20, value(v2)))),
				edge(e1, value(v1, edge(e21, value(v2)))),
)),
|
|
||||||
),
|
|
||||||
}
|
|
||||||
|
|
||||||
for i := range tests {
|
|
||||||
t.Logf("test[%d]:%q", i, tests[i].name)
|
|
||||||
out := tests[i].out()
|
|
||||||
for j, exp := range tests[i].exp {
|
|
||||||
msgAndArgs := []interface{}{
|
|
||||||
"tests[%d].name:%q exp[%d].val:%q",
|
|
||||||
i, tests[i].name, j, exp.Value.V.(string),
|
|
||||||
}
|
|
||||||
v := out.ValueVertex(exp.Value)
|
|
||||||
if !assert.NotNil(t, v, msgAndArgs...) {
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
assertVertexEqual(t, exp, v, msgAndArgs...)
|
|
||||||
}
|
|
||||||
|
|
||||||
msgAndArgs := []interface{}{
|
|
||||||
"tests[%d].name:%q",
|
|
||||||
i, tests[i].name,
|
|
||||||
}
|
|
||||||
|
|
||||||
// sanity check that graphs are equal to themselves
|
|
||||||
assert.True(t, Equal(out, out), msgAndArgs...)
|
|
||||||
|
|
||||||
// test the Iter method in here too
|
|
||||||
assertIter(t, tests[i].numVals, tests[i].numJuncs, out, msgAndArgs...)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
func TestGraphImmutability(t *T) {
|
|
||||||
v0 := NewValue("v0")
|
|
||||||
v1 := NewValue("v1")
|
|
||||||
e0 := NewValue("e0")
|
|
||||||
oe0 := ValueOut(v0, e0)
|
|
||||||
g0 := Null.AddValueIn(oe0, v1)
|
|
||||||
assert.Nil(t, Null.ValueVertex(v0))
|
|
||||||
assert.Nil(t, Null.ValueVertex(v1))
|
|
||||||
assert.NotNil(t, g0.ValueVertex(v0))
|
|
||||||
assert.NotNil(t, g0.ValueVertex(v1))
|
|
||||||
|
|
||||||
// half-edges should be re-usable
|
|
||||||
v2 := NewValue("v2")
|
|
||||||
v3a, v3b := NewValue("v3a"), NewValue("v3b")
|
|
||||||
e1 := NewValue("e1")
|
|
||||||
oe1 := ValueOut(v2, e1)
|
|
||||||
g1a := g0.AddValueIn(oe1, v3a)
|
|
||||||
g1b := g0.AddValueIn(oe1, v3b)
|
|
||||||
assertVertexEqual(t, value(v3a, edge(e1, value(v2))), g1a.ValueVertex(v3a))
|
|
||||||
assert.Nil(t, g1a.ValueVertex(v3b))
|
|
||||||
assertVertexEqual(t, value(v3b, edge(e1, value(v2))), g1b.ValueVertex(v3b))
|
|
||||||
assert.Nil(t, g1b.ValueVertex(v3a))
|
|
||||||
|
|
||||||
// ... even re-usable twice in succession
|
|
||||||
v3 := NewValue("v3")
|
|
||||||
v4 := NewValue("v4")
|
|
||||||
g2 := g0.AddValueIn(oe1, v3).AddValueIn(oe1, v4)
|
|
||||||
assert.Nil(t, g2.ValueVertex(v3b))
|
|
||||||
assert.Nil(t, g2.ValueVertex(v3a))
|
|
||||||
assertVertexEqual(t, value(v3, edge(e1, value(v2))), g2.ValueVertex(v3))
|
|
||||||
assertVertexEqual(t, value(v4, edge(e1, value(v2))), g2.ValueVertex(v4))
|
|
||||||
}
|
|
||||||
|
|
||||||
func TestGraphDelValueIn(t *T) {
|
|
||||||
v0 := NewValue("v0")
|
|
||||||
v1 := NewValue("v1")
|
|
||||||
e0 := NewValue("e0")
|
|
||||||
{ // removing from null
|
|
||||||
g := Null.DelValueIn(ValueOut(v0, e0), v1)
|
|
||||||
assert.True(t, Equal(Null, g))
|
|
||||||
}
|
|
||||||
|
|
||||||
e1 := NewValue("e1")
|
|
||||||
{ // removing edge from vertex which doesn't have that edge
|
|
||||||
g0 := Null.AddValueIn(ValueOut(v0, e0), v1)
|
|
||||||
g1 := g0.DelValueIn(ValueOut(v0, e1), v1)
|
|
||||||
assert.True(t, Equal(g0, g1))
|
|
||||||
}
|
|
||||||
|
|
||||||
{ // removing only edge
|
|
||||||
oe := ValueOut(v0, e0)
|
|
||||||
g0 := Null.AddValueIn(oe, v1)
|
|
||||||
g1 := g0.DelValueIn(oe, v1)
|
|
||||||
assert.True(t, Equal(Null, g1))
|
|
||||||
}
|
|
||||||
|
|
||||||
ej0 := NewValue("ej0")
|
|
||||||
v2 := NewValue("v2")
|
|
||||||
{ // removing only edge (junction)
|
|
||||||
oe := JunctionOut([]OpenEdge{
|
|
||||||
ValueOut(v0, e0),
|
|
||||||
ValueOut(v1, e1),
|
|
||||||
}, ej0)
|
|
||||||
g0 := Null.AddValueIn(oe, v2)
|
|
||||||
g1 := g0.DelValueIn(oe, v2)
|
|
||||||
assert.True(t, Equal(Null, g1))
|
|
||||||
}
|
|
||||||
|
|
||||||
{ // removing one of two edges
|
|
||||||
oe := ValueOut(v1, e0)
|
|
||||||
g0 := Null.AddValueIn(ValueOut(v0, e0), v2)
|
|
||||||
g1 := g0.AddValueIn(oe, v2)
|
|
||||||
g2 := g1.DelValueIn(oe, v2)
|
|
||||||
assert.True(t, Equal(g0, g2))
|
|
||||||
assert.NotNil(t, g2.ValueVertex(v0))
|
|
||||||
assert.Nil(t, g2.ValueVertex(v1))
|
|
||||||
assert.NotNil(t, g2.ValueVertex(v2))
|
|
||||||
}
|
|
||||||
|
|
||||||
e2 := NewValue("e2")
|
|
||||||
eja, ejb := NewValue("eja"), NewValue("ejb")
|
|
||||||
v3 := NewValue("v3")
|
|
||||||
{ // removing one of two edges (junction)
|
|
||||||
e0 := ValueOut(v0, e0)
|
|
||||||
e1 := ValueOut(v1, e1)
|
|
||||||
e2 := ValueOut(v2, e2)
|
|
||||||
oeA := JunctionOut([]OpenEdge{e0, e1}, eja)
|
|
||||||
oeB := JunctionOut([]OpenEdge{e1, e2}, ejb)
|
|
||||||
g0a := Null.AddValueIn(oeA, v3)
|
|
||||||
g0b := Null.AddValueIn(oeB, v3)
|
|
||||||
g1 := g0a.Union(g0b).DelValueIn(oeA, v3)
|
|
||||||
assert.True(t, Equal(g1, g0b))
|
|
||||||
assert.Nil(t, g1.ValueVertex(v0))
|
|
||||||
assert.NotNil(t, g1.ValueVertex(v1))
|
|
||||||
assert.NotNil(t, g1.ValueVertex(v2))
|
|
||||||
assert.NotNil(t, g1.ValueVertex(v3))
|
|
||||||
}
|
|
||||||
|
|
||||||
{ // removing one of two edges in circular graph
|
|
||||||
e0 := ValueOut(v0, e0)
|
|
||||||
e1 := ValueOut(v1, e1)
|
|
||||||
g0 := Null.AddValueIn(e0, v1).AddValueIn(e1, v0)
|
|
||||||
g1 := g0.DelValueIn(e0, v1)
|
|
||||||
assert.True(t, Equal(Null.AddValueIn(e1, v0), g1))
|
|
||||||
assert.NotNil(t, g1.ValueVertex(v0))
|
|
||||||
assert.NotNil(t, g1.ValueVertex(v1))
|
|
||||||
}
|
|
||||||
|
|
||||||
ej := NewValue("ej")
|
|
||||||
{ // removing to's only edge, sub-nodes have edge to each other
|
|
||||||
oej := JunctionOut([]OpenEdge{
|
|
||||||
ValueOut(v0, ej0),
|
|
||||||
ValueOut(v1, ej0),
|
|
||||||
}, ej)
|
|
||||||
g0 := Null.AddValueIn(oej, v2)
|
|
||||||
e0 := ValueOut(v0, e0)
|
|
||||||
g1 := g0.AddValueIn(e0, v1)
|
|
||||||
g2 := g1.DelValueIn(oej, v2)
|
|
||||||
assert.True(t, Equal(Null.AddValueIn(e0, v1), g2))
|
|
||||||
assert.NotNil(t, g2.ValueVertex(v0))
|
|
||||||
assert.NotNil(t, g2.ValueVertex(v1))
|
|
||||||
assert.Nil(t, g2.ValueVertex(v2))
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// deterministically hashes a Graph
|
|
||||||
func graphStr(g *Graph) string {
|
|
||||||
var vStr func(vertex) string
|
|
||||||
var oeStr func(OpenEdge) string
|
|
||||||
vStr = func(v vertex) string {
|
|
||||||
if v.VertexType == ValueVertex {
|
|
||||||
return fmt.Sprintf("v:%q\n", v.val.V.(string))
|
|
||||||
}
|
|
||||||
s := fmt.Sprintf("j:%d\n", len(v.in))
|
|
||||||
ssOE := make([]string, len(v.in))
|
|
||||||
for i := range v.in {
|
|
||||||
ssOE[i] = oeStr(v.in[i])
|
|
||||||
}
|
|
||||||
sort.Strings(ssOE)
|
|
||||||
return s + strings.Join(ssOE, "")
|
|
||||||
}
|
|
||||||
oeStr = func(oe OpenEdge) string {
|
|
||||||
s := fmt.Sprintf("oe:%q\n", oe.val.V.(string))
|
|
||||||
return s + vStr(oe.fromV)
|
|
||||||
}
|
|
||||||
sVV := make([]string, 0, len(g.vM))
|
|
||||||
for _, v := range g.vM {
|
|
||||||
sVV = append(sVV, vStr(v))
|
|
||||||
}
|
|
||||||
sort.Strings(sVV)
|
|
||||||
return strings.Join(sVV, "")
|
|
||||||
}
|
|
||||||
|
|
||||||
func assertEqualSets(t *T, exp, got []*Graph) bool {
|
|
||||||
if !assert.Equal(t, len(exp), len(got)) {
|
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
m := map[*Graph]string{}
|
|
||||||
for _, g := range exp {
|
|
||||||
m[g] = graphStr(g)
|
|
||||||
}
|
|
||||||
for _, g := range got {
|
|
||||||
m[g] = graphStr(g)
|
|
||||||
}
|
|
||||||
|
|
||||||
sort.Slice(exp, func(i, j int) bool {
|
|
||||||
return m[exp[i]] < m[exp[j]]
|
|
||||||
})
|
|
||||||
sort.Slice(got, func(i, j int) bool {
|
|
||||||
return m[got[i]] < m[got[j]]
|
|
||||||
})
|
|
||||||
|
|
||||||
	b := true
	for i := range exp {
		// assert first, then AND, so every pair is checked even after a failure
		b = assert.True(t, Equal(exp[i], got[i]), "i:%d exp:%q got:%q", i, m[exp[i]], m[got[i]]) && b
	}
	return b
}

func TestGraphUnion(t *T) {
	assertUnion := func(g1, g2 *Graph) *Graph {
		ga := g1.Union(g2)
		gb := g2.Union(g1)
		assert.True(t, Equal(ga, gb))
		return ga
	}

	assertDisjoin := func(g *Graph, exp ...*Graph) {
		ggDisj := g.Disjoin()
		assertEqualSets(t, exp, ggDisj)
	}

	v0 := NewValue("v0")
	v1 := NewValue("v1")
	e0 := NewValue("e0")
	{ // Union with Null
		assert.True(t, Equal(Null, Null.Union(Null)))

		g := Null.AddValueIn(ValueOut(v0, e0), v1)
		assert.True(t, Equal(g, assertUnion(g, Null)))

		assertDisjoin(g, g)
	}

	v2 := NewValue("v2")
	v3 := NewValue("v3")
	e1 := NewValue("e1")
	{ // Two disparate graphs union'd
		g0 := Null.AddValueIn(ValueOut(v0, e0), v1)
		g1 := Null.AddValueIn(ValueOut(v2, e1), v3)
		g := assertUnion(g0, g1)
		assertVertexEqual(t, value(v0), g.ValueVertex(v0))
		assertVertexEqual(t, value(v1, edge(e0, value(v0))), g.ValueVertex(v1))
		assertVertexEqual(t, value(v2), g.ValueVertex(v2))
		assertVertexEqual(t, value(v3, edge(e1, value(v2))), g.ValueVertex(v3))

		assertDisjoin(g, g0, g1)
	}

	va0, vb0 := NewValue("va0"), NewValue("vb0")
	va1, vb1 := NewValue("va1"), NewValue("vb1")
	va2, vb2 := NewValue("va2"), NewValue("vb2")
	ea0, eb0 := NewValue("ea0"), NewValue("eb0")
	ea1, eb1 := NewValue("ea1"), NewValue("eb1")
	eaj, ebj := NewValue("eaj"), NewValue("ebj")
	{ // Two disparate graphs with junctions
		ga := Null.AddValueIn(JunctionOut([]OpenEdge{
			ValueOut(va0, ea0),
			ValueOut(va1, ea1),
		}, eaj), va2)
		gb := Null.AddValueIn(JunctionOut([]OpenEdge{
			ValueOut(vb0, eb0),
			ValueOut(vb1, eb1),
		}, ebj), vb2)
		g := assertUnion(ga, gb)
		assertVertexEqual(t, value(va0), g.ValueVertex(va0))
		assertVertexEqual(t, value(va1), g.ValueVertex(va1))
		assertVertexEqual(t,
			value(va2, junction(eaj,
				edge(ea0, value(va0)),
				edge(ea1, value(va1)))),
			g.ValueVertex(va2),
		)
		assertVertexEqual(t, value(vb0), g.ValueVertex(vb0))
		assertVertexEqual(t, value(vb1), g.ValueVertex(vb1))
		assertVertexEqual(t,
			value(vb2, junction(ebj,
				edge(eb0, value(vb0)),
				edge(eb1, value(vb1)))),
			g.ValueVertex(vb2),
		)

		assertDisjoin(g, ga, gb)
	}

	{ // Two partially overlapping graphs
		g0 := Null.AddValueIn(ValueOut(v0, e0), v2)
		g1 := Null.AddValueIn(ValueOut(v1, e1), v2)
		g := assertUnion(g0, g1)
		assertVertexEqual(t, value(v0), g.ValueVertex(v0))
		assertVertexEqual(t, value(v1), g.ValueVertex(v1))
		assertVertexEqual(t,
			value(v2,
				edge(e0, value(v0)),
				edge(e1, value(v1)),
			),
			g.ValueVertex(v2),
		)

		assertDisjoin(g, g)
	}

	ej0 := NewValue("ej0")
	ej1 := NewValue("ej1")
	{ // two partially overlapping graphs with junctions
		g0 := Null.AddValueIn(JunctionOut([]OpenEdge{
			ValueOut(v0, e0),
			ValueOut(v1, e1),
		}, ej0), v2)
		g1 := Null.AddValueIn(JunctionOut([]OpenEdge{
			ValueOut(v0, e0),
			ValueOut(v1, e1),
		}, ej1), v2)
		g := assertUnion(g0, g1)
		assertVertexEqual(t, value(v0), g.ValueVertex(v0))
		assertVertexEqual(t, value(v1), g.ValueVertex(v1))
		assertVertexEqual(t,
			value(v2,
				junction(ej0, edge(e0, value(v0)), edge(e1, value(v1))),
				junction(ej1, edge(e0, value(v0)), edge(e1, value(v1))),
			),
			g.ValueVertex(v2),
		)

		assertDisjoin(g, g)
	}

	{ // Two equal graphs
		g0 := Null.AddValueIn(ValueOut(v0, e0), v1)
		g := assertUnion(g0, g0)
		assertVertexEqual(t, value(v0), g.ValueVertex(v0))
		assertVertexEqual(t,
			value(v1, edge(e0, value(v0))),
			g.ValueVertex(v1),
		)
	}

	{ // Two equal graphs with junctions
		g0 := Null.AddValueIn(JunctionOut([]OpenEdge{
			ValueOut(v0, e0),
			ValueOut(v1, e1),
		}, ej0), v2)
		g := assertUnion(g0, g0)
		assertVertexEqual(t, value(v0), g.ValueVertex(v0))
		assertVertexEqual(t, value(v1), g.ValueVertex(v1))
		assertVertexEqual(t,
			value(v2,
				junction(ej0, edge(e0, value(v0)), edge(e1, value(v1))),
			),
			g.ValueVertex(v2),
		)
	}
}

func TestGraphEqual(t *T) {
	assertEqual := func(g1, g2 *Graph) {
		assert.True(t, Equal(g1, g2))
		assert.True(t, Equal(g2, g1))
	}

	assertNotEqual := func(g1, g2 *Graph) {
		assert.False(t, Equal(g1, g2))
		assert.False(t, Equal(g2, g1))
	}

	assertEqual(Null, Null) // duh

	v0 := NewValue("v0")
	v1 := NewValue("v1")
	v2 := NewValue("v2")
	e0 := NewValue("e0")
	e1 := NewValue("e1")
	e1a, e1b := NewValue("e1a"), NewValue("e1b")
	{
		// graph is equal to itself, not to null
		e0 := ValueOut(v0, e0)
		g0 := Null.AddValueIn(e0, v1)
		assertNotEqual(g0, Null)
		assertEqual(g0, g0)

		// adding an existing edge again shouldn't do anything
		assertEqual(g0, g0.AddValueIn(e0, v1))

		// g1a and g1b have the same vertices, but the edges are different
		g1a := g0.AddValueIn(ValueOut(v0, e1a), v2)
		g1b := g0.AddValueIn(ValueOut(v0, e1b), v2)
		assertNotEqual(g1a, g1b)
	}

	{ // equal construction should yield equality, even if out of order
		ga := Null.AddValueIn(ValueOut(v0, e0), v1)
		ga = ga.AddValueIn(ValueOut(v1, e1), v2)
		gb := Null.AddValueIn(ValueOut(v1, e1), v2)
		gb = gb.AddValueIn(ValueOut(v0, e0), v1)
		assertEqual(ga, gb)
	}

	ej := NewValue("ej")
	{ // junction basic test
		e0 := ValueOut(v0, e0)
		e1 := ValueOut(v1, e1)
		ga := Null.AddValueIn(JunctionOut([]OpenEdge{e0, e1}, ej), v2)
		gb := Null.AddValueIn(JunctionOut([]OpenEdge{e1, e0}, ej), v2)
		assertEqual(ga, ga)
		assertNotEqual(ga, gb)
	}
}
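The `graphStr` and `assertEqualSets` helpers above make graph comparison order-insensitive by serializing each sub-structure, sorting the serializations, and joining them. A self-contained sketch of that idea follows; the `node` type and `nodeStr` name are illustrative stand-ins, not part of the diff:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// node is a hypothetical stand-in for the test file's vertex type: a value
// plus child nodes.
type node struct {
	val      string
	children []node
}

// nodeStr mirrors the idea behind graphStr: serialize the children, sort the
// serializations, and join them, so that two equivalent structures produce
// the same string regardless of construction order.
func nodeStr(n node) string {
	ss := make([]string, len(n.children))
	for i := range n.children {
		ss[i] = nodeStr(n.children[i])
	}
	sort.Strings(ss)
	return fmt.Sprintf("v:%q\n", n.val) + strings.Join(ss, "")
}

func main() {
	a := node{"root", []node{{val: "x"}, {val: "y"}}}
	b := node{"root", []node{{val: "y"}, {val: "x"}}}
	fmt.Println(nodeStr(a) == nodeStr(b)) // true: child order doesn't matter
}
```

Sorting before joining is what lets `assertEqualSets` line up two slices of graphs positionally and compare them pairwise.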
gg/grammar/example_test.go (new file)
@@ -0,0 +1,154 @@
package grammar_test

import (
	"bytes"
	"fmt"
	"strconv"
	"strings"

	"code.betamike.com/mediocregopher/ginger/gg/grammar"
	"golang.org/x/exp/slices"
)

/*
This example demonstrates how to describe the following EBNF using the grammar
package:

```
<digit> ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
<positive-number> ::= <digit>+
<negative-number> ::= "-" <positive-number>
<number> ::= <negative-number> | <positive-number>

<list-el-tail> ::= <element> <list-tail>
<list-tail> ::= ")" | "," <list-el-tail>
<list-head> ::= ")" | <list-el-tail>
<list> ::= "(" <list-head>

<element> ::= <number> | <list>
```
*/

// Element represents an element of a list, which can be either a number or a
// sub-list.
type Element struct {
	Number int64
	List   []Element
}

func (e Element) String() string {
	if e.List != nil {
		listElStrs := make([]string, len(e.List))
		for i := range e.List {
			listElStrs[i] = e.List[i].String()
		}
		return fmt.Sprintf("(%s)", strings.Join(listElStrs, ","))
	}

	return fmt.Sprint(e.Number)
}

var (
	digit = grammar.RuneFunc(
		"digit", func(r rune) bool { return '0' <= r && r <= '9' },
	)

	positiveNumber = grammar.StringFromRunes(grammar.OneOrMore(digit))

	negativeNumber = grammar.Reduction(
		grammar.Rune('-'),
		positiveNumber,
		func(
			neg grammar.Located[rune], posNum grammar.Located[string],
		) grammar.Located[string] {
			return grammar.Locate(neg.Location, string(neg.Value)+posNum.Value)
		},
	)

	number = grammar.Named(
		"number",
		grammar.Mapping(
			grammar.FirstOf(negativeNumber, positiveNumber),
			func(str grammar.Located[string]) Element {
				i, err := strconv.ParseInt(str.Value, 10, 64)
				if err != nil {
					panic(fmt.Errorf("parsing %q as int: %w", str, err))
				}
				return Element{Number: i}
			},
		),
	)

	// Because the list/element definitions are recursive, they require using
	// SymbolPtrs, which is easier to do via a global initialization function
	// like this.
	list = func() grammar.Symbol[Element] {

		type listState []Element

		var (
			listTail = new(grammar.SymbolPtr[listState])
			list     = new(grammar.SymbolPtr[Element])
			element  = new(grammar.SymbolPtr[Element])

			// Right parenthesis indicates the end of a list, at which point we
			// can initialize the state which gets returned down the stack.
			listTerm = grammar.Mapping(
				grammar.Rune(')'),
				func(grammar.Located[rune]) listState { return listState{} },
			)

			listElTail = grammar.Reduction[Element, listState, listState](
				element,
				listTail,
				func(el Element, ls listState) listState {
					ls = append(ls, el)
					return ls
				},
			)

			listHead = grammar.FirstOf(listTerm, listElTail)
		)

		listTail.Symbol = grammar.FirstOf(
			listTerm,
			grammar.Prefixed(grammar.Rune(','), listElTail),
		)

		list.Symbol = grammar.Named(
			"list",
			grammar.Reduction[grammar.Located[rune], listState, Element](
				grammar.Rune('('),
				listHead,
				func(_ grammar.Located[rune], ls listState) Element {
					slices.Reverse(ls)
					return Element{List: []Element(ls)}
				},
			),
		)

		element.Symbol = grammar.FirstOf[Element](number, list)

		return list
	}()
)

func Example() {
	r := grammar.NewReader(bytes.NewBufferString(
		`()` + `(1,(2,-3),4)` + `(ERROR`,
	))

	l1, err := list.Decode(r)
	fmt.Println(l1, err)

	l2, err := list.Decode(r)
	fmt.Println(l2, err)

	_, err = list.Decode(r)
	fmt.Println(err)

	// Output:
	// () <nil>
	// (1,(2,-3),4) <nil>
	// 1:16: expected ')' or number or list
}
gg/grammar/grammar.go (new file)
@@ -0,0 +1,27 @@
// Package grammar is used for parsing a stream of runes according to a set of
// grammatical rules. This package only supports context-free grammars.
package grammar

import "fmt"

// Stringer is a convenience tool for working with fmt.Stringer. Exactly one of
// the fields must be set, and will be used to implement the fmt.Stringer
// interface.
type Stringer struct {
	I fmt.Stringer
	F func() string
	S string
}

func (s Stringer) String() string {
	switch {
	case s.I != nil:
		return s.I.String()
	case s.F != nil:
		return s.F()
	case s.S != "":
		return s.S
	default:
		panic("no fields set on Stringer")
	}
}
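The Stringer helper above lets a Symbol's name be either a fixed string, another fmt.Stringer, or a lazily computed value. A minimal standalone demonstration, reproducing the type verbatim from grammar.go so the snippet runs on its own:

```go
package main

import "fmt"

// Stringer is copied from grammar.go above: exactly one of the three fields
// is expected to be set, and String dispatches to whichever one it is.
type Stringer struct {
	I fmt.Stringer
	F func() string
	S string
}

func (s Stringer) String() string {
	switch {
	case s.I != nil:
		return s.I.String()
	case s.F != nil:
		return s.F()
	case s.S != "":
		return s.S
	default:
		panic("no fields set on Stringer")
	}
}

func main() {
	// S gives a static name, F defers the work until the name is needed
	// (as FirstOf and OneOrMore do when composing error messages).
	fmt.Println(Stringer{S: "number"})
	fmt.Println(Stringer{F: func() string { return "a or b" }})
}
```

The F field matters for composed Symbols: the name of a FirstOf is built from its children's names, and computing it eagerly on every construction would be wasted work.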
gg/grammar/location.go (new file)
@@ -0,0 +1,32 @@
package grammar

import "fmt"

// Location indicates a position in a stream of runes identified by column
// within newline-separated rows.
type Location struct {
	Row, Col int
}

func (l Location) errf(str string, args ...any) LocatedError {
	return LocatedError{l, fmt.Errorf(str, args...)}
}

// Located wraps a value so that it has a Location attached to it.
type Located[T any] struct {
	Location
	Value T
}

// Locate returns a Located instance combining the given values.
func Locate[T any](l Location, v T) Located[T] {
	return Located[T]{l, v}
}

// LocatedError is an error related to a specific point within a stream of
// runes.
type LocatedError Located[error]

func (e LocatedError) Error() string {
	return fmt.Sprintf("%d:%d: %v", e.Row, e.Col, e.Value)
}
gg/grammar/reader.go (new file)
@@ -0,0 +1,74 @@
package grammar

import (
	"bufio"
	"io"
)

// Reader is used for reading Runes from a stream.
type Reader interface {

	// ReadRune reads the next Rune off the stream, or returns io.EOF.
	ReadRune() (Located[rune], error)

	// UnreadRune can be used to place a Rune onto an internal buffer, such that
	// the Rune will be the next to be read using ReadRune. If called multiple
	// times then ReadRune will produce the given Runes in LIFO order.
	UnreadRune(Located[rune])

	// NextLocation returns the Location of the next Rune which will be returned
	// with ReadRune.
	NextLocation() Location
}

type reader struct {
	br        *bufio.Reader
	brNextLoc Location

	unread []Located[rune]
}

// NewReader wraps the io.Reader as a Reader. The given Reader should not be
// read from after this call.
func NewReader(r io.Reader) Reader {
	return &reader{
		br:        bufio.NewReader(r),
		brNextLoc: Location{Row: 1, Col: 1},
	}
}

func (rr *reader) ReadRune() (Located[rune], error) {
	if len(rr.unread) > 0 {
		r := rr.unread[len(rr.unread)-1]
		rr.unread = rr.unread[:len(rr.unread)-1]
		return r, nil
	}

	loc := rr.brNextLoc

	r, _, err := rr.br.ReadRune()
	if err != nil {
		return Located[rune]{}, err
	}

	if r == '\n' {
		rr.brNextLoc.Row++
		rr.brNextLoc.Col = 1
	} else {
		rr.brNextLoc.Col++
	}

	return Located[rune]{loc, r}, nil
}

func (rr *reader) UnreadRune(r Located[rune]) {
	rr.unread = append(rr.unread, r)
}

func (rr *reader) NextLocation() Location {
	if len(rr.unread) > 0 {
		return rr.unread[len(rr.unread)-1].Location
	}

	return rr.brNextLoc
}
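The reader's location bookkeeping hands out the position a rune occupied *before* it was consumed, and a newline bumps Row while resetting Col to 1. A stripped-down, self-contained sketch of just that logic; the `locate` helper is illustrative and not part of reader.go:

```go
package main

import "fmt"

// Location mirrors the struct from location.go.
type Location struct{ Row, Col int }

// locate returns, for each rune in s, the 1-indexed position it was read
// from, using the same advance rule as reader.ReadRune: record the current
// position, then advance it (newline moves to the next row, column 1).
func locate(s string) []Location {
	next := Location{Row: 1, Col: 1}
	locs := make([]Location, 0, len(s))
	for _, r := range s {
		locs = append(locs, next)
		if r == '\n' {
			next.Row++
			next.Col = 1
		} else {
			next.Col++
		}
	}
	return locs
}

func main() {
	// "ab\nc": a and b on row 1, the newline itself at (1,3), c at (2,1).
	fmt.Println(locate("ab\nc")) // [{1 1} {1 2} {1 3} {2 1}]
}
```

Recording the location before the read is what makes the LIFO unread buffer cheap: an unread rune carries its own Location, so NextLocation can just peek at the top of the buffer.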
gg/grammar/symbol.go (new file)
@@ -0,0 +1,306 @@
package grammar

import (
	"errors"
	"fmt"
	"io"
	"strings"
)

// ErrNoMatch is used by Symbol's Decode method, see that method's docs for more
// details.
var ErrNoMatch = errors.New("no match")

// Symbol represents a symbol in the grammar. A Symbol is expected to be
// stateless, and is usually constructed from other Symbols using functions in
// this package.
type Symbol[T any] interface {
	fmt.Stringer // Used when generating errors related to this Symbol, e.g. "number"

	// Decode reads and parses a value represented by this Symbol off the
	// Reader.
	//
	// This may return ErrNoMatch to indicate that the upcoming data on the
	// Reader is rejected by this Symbol. In this case the Symbol should leave
	// the Reader in the same state it was passed.
	Decode(Reader) (T, error)
}

type symbol[T any] struct {
	fmt.Stringer
	decodeFn func(Reader) (T, error)
}

func (s *symbol[T]) Decode(r Reader) (T, error) { return s.decodeFn(r) }

// SymbolPtr wraps a Symbol in such a way as to make lazily initializing a
// Symbol variable possible. This allows for recursion amongst different
// Symbols.
//
// Example:
//
// a := new(SymbolPtr)
// b := new(SymbolPtr)
// a.Symbol = FirstOf(Rune('a'), b)
// b.Symbol = FirstOf(Rune('b'), a)
type SymbolPtr[T any] struct {
	Symbol[T]
}

func named[T any](stringer fmt.Stringer, sym Symbol[T]) Symbol[T] {
	return &symbol[T]{
		stringer,
		sym.Decode,
	}
}

// Named wraps the given Symbol such that its String method returns the given
// name.
func Named[T any](name string, sym Symbol[T]) Symbol[T] {
	return named(Stringer{S: name}, sym)
}

// RuneFunc matches and produces any rune for which the given function returns
// true.
func RuneFunc(name string, fn func(rune) bool) Symbol[Located[rune]] {
	return &symbol[Located[rune]]{
		Stringer{S: name},
		func(rr Reader) (Located[rune], error) {
			var zero Located[rune]

			r, err := rr.ReadRune()
			if errors.Is(err, io.EOF) {
				return zero, ErrNoMatch
			} else if err != nil {
				return zero, err
			}

			if !fn(r.Value) {
				rr.UnreadRune(r)
				return zero, ErrNoMatch
			}

			return r, nil
		},
	}
}

// Rune matches and produces the given rune.
func Rune(r rune) Symbol[Located[rune]] {
	return RuneFunc(
		fmt.Sprintf("'%c'", r),
		func(r2 rune) bool { return r == r2 },
	)
}

// StringFromRunes produces a string from the slice of runes produced by the
// given Symbol. The slice must not be empty. StringFromRunes does not match if
// the given Symbol does not match.
func StringFromRunes(sym Symbol[[]Located[rune]]) Symbol[Located[string]] {
	return Mapping(sym, func(runes []Located[rune]) Located[string] {
		if len(runes) == 0 {
			panic("StringFromRunes used on empty set of runes")
		}

		str := make([]rune, len(runes))
		for i := range runes {
			str[i] = runes[i].Value
		}
		return Located[string]{runes[0].Location, string(str)}
	})
}

// Mapping produces a value of type Tb by decoding a value from the given
// Symbol and passing it through the given mapping function. If the given Symbol
// doesn't match then neither does Mapping.
func Mapping[Ta, Tb any](
	sym Symbol[Ta], fn func(Ta) Tb,
) Symbol[Tb] {
	return &symbol[Tb]{
		sym,
		func(rr Reader) (Tb, error) {
			var zero Tb
			va, err := sym.Decode(rr)
			if err != nil {
				return zero, err
			}
			return fn(va), nil
		},
	}
}

// OneOrMore will produce as many of the given Symbol's value as can be found
// sequentially, up until a non-matching value is encountered. If no matches are
// found then OneOrMore does not match.
func OneOrMore[T any](sym Symbol[T]) Symbol[[]T] {
	return &symbol[[]T]{
		Stringer{F: func() string {
			return fmt.Sprintf("one or more %v", sym)
		}},
		func(rr Reader) ([]T, error) {
			var vv []T
			for {
				v, err := sym.Decode(rr)
				if errors.Is(err, ErrNoMatch) {
					break
				} else if err != nil {
					return nil, err
				}

				vv = append(vv, v)
			}

			if len(vv) == 0 {
				return nil, ErrNoMatch
			}

			return vv, nil
		},
	}
}

// ZeroOrMore will produce as many of the given Symbol's value as can be found
// sequentially, up until a non-matching value is encountered. If no matches are
// found then an empty slice is produced.
func ZeroOrMore[T any](sym Symbol[T]) Symbol[[]T] {
	return &symbol[[]T]{
		Stringer{F: func() string {
			return fmt.Sprintf("zero or more %v", sym)
		}},
		func(rr Reader) ([]T, error) {
			var vv []T
			for {
				v, err := sym.Decode(rr)
				if errors.Is(err, ErrNoMatch) {
					break
				} else if err != nil {
					return nil, err
				}

				vv = append(vv, v)
			}

			return vv, nil
		},
	}
}

func firstOf[T any](stringer fmt.Stringer, syms ...Symbol[T]) Symbol[T] {
	return &symbol[T]{
		stringer,
		func(rr Reader) (T, error) {
			var zero T
			for _, sym := range syms {
				v, err := sym.Decode(rr)
				if errors.Is(err, ErrNoMatch) {
					continue
				} else if err != nil {
					return zero, err
				}

				return v, nil
			}

			return zero, ErrNoMatch
		},
	}
}

// FirstOf matches and produces the value for the first Symbol in the list which
// matches. FirstOf does not match if none of the given Symbols match.
func FirstOf[T any](syms ...Symbol[T]) Symbol[T] {
	return firstOf(
		Stringer{F: func() string {
			descrs := make([]string, len(syms))
			for i := range syms {
				descrs[i] = syms[i].String()
			}
			return strings.Join(descrs, " or ")
		}},
		syms...,
	)
}

// Reduction produces a value of type Tc by first reading a value from symA,
// then symB, and then running those through the given function.
|
||||||
|
//
|
||||||
|
// If symA does not match then Reduction does not match. If symA matches but
|
||||||
|
// symB does not then also match then Reduction produces a LocatedError.
|
||||||
|
func Reduction[Ta, Tb, Tc any](
|
||||||
|
symA Symbol[Ta],
|
||||||
|
symB Symbol[Tb],
|
||||||
|
fn func(Ta, Tb) Tc,
|
||||||
|
) Symbol[Tc] {
|
||||||
|
return &symbol[Tc]{
|
||||||
|
symA,
|
||||||
|
func(rr Reader) (Tc, error) {
|
||||||
|
var zero Tc
|
||||||
|
|
||||||
|
va, err := symA.Decode(rr)
|
||||||
|
if err != nil {
|
||||||
|
return zero, err
|
||||||
|
}
|
||||||
|
|
||||||
|
vb, err := symB.Decode(rr)
|
||||||
|
if errors.Is(err, ErrNoMatch) {
|
||||||
|
return zero, rr.NextLocation().errf("expected %v", symB)
|
||||||
|
} else if err != nil {
|
||||||
|
return zero, err
|
||||||
|
}
|
||||||
|
|
||||||
|
return fn(va, vb), nil
|
||||||
|
|
||||||
|
},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Prefixed matches on prefixSym, discards its value, then produces the value
|
||||||
|
// produced by sym.
|
||||||
|
//
|
||||||
|
// If prefixSym does not match then Prefixed does not match. If prefixSym
|
||||||
|
// matches but sym does not also match then Prefixed produces a LocatedError.
|
||||||
|
func Prefixed[Ta, Tb any](prefixSym Symbol[Ta], sym Symbol[Tb]) Symbol[Tb] {
|
||||||
|
return named(prefixSym, Reduction(prefixSym, sym, func(_ Ta, b Tb) Tb {
|
||||||
|
return b
|
||||||
|
}))
|
||||||
|
}
|
||||||
|
|
||||||
|
// PrefixDiscarded is similar to Prefixed, except that if sym does not match
|
||||||
|
// then PrefixDiscarded does not match, whereas Prefixed produces a LocatedError
|
||||||
|
// in that case.
|
||||||
|
//
|
||||||
|
// NOTE PrefixDiscarded does not fully honor the contract of Symbol. If
|
||||||
|
// prefixSym matches, but sym does not, then only sym will restore Reader to its
|
||||||
|
// prior state; prefixSym cannot return whatever data it read back onto the
|
||||||
|
// Reader. Therefore ErrNoMatch can be returned without Reader being fully back
|
||||||
|
// in its original state. In practice this isn't a big deal, given the common
|
||||||
|
// use-cases of PrefixDiscarded, but it may prove tricky.
|
||||||
|
func PrefixDiscarded[Ta, Tb any](prefixSym Symbol[Ta], sym Symbol[Tb]) Symbol[Tb] {
|
||||||
|
return &symbol[Tb]{
|
||||||
|
sym,
|
||||||
|
func(rr Reader) (Tb, error) {
|
||||||
|
var zero Tb
|
||||||
|
if _, err := prefixSym.Decode(rr); err != nil {
|
||||||
|
return zero, err
|
||||||
|
}
|
||||||
|
return sym.Decode(rr)
|
||||||
|
},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Suffixed matchs on sym and then suffixSym, returning the value produced by
|
||||||
|
// sym and discarding the one produced by suffixSym.
|
||||||
|
//
|
||||||
|
// If sym does not match then Suffixed does not match. If sym matches but
|
||||||
|
// suffixSym does not also match then Suffixed produces a LocatedError.
|
||||||
|
func Suffixed[Ta, Tb any](sym Symbol[Ta], suffixSym Symbol[Tb]) Symbol[Ta] {
|
||||||
|
return named(sym, Reduction(sym, suffixSym, func(a Ta, _ Tb) Ta {
|
||||||
|
return a
|
||||||
|
}))
|
||||||
|
}
|
||||||
|
|
||||||
|
// Discard matches if the given Symbol does, but discards the value it produces,
|
||||||
|
// producing an empty value instead.
|
||||||
|
func Discard[T any](sym Symbol[T]) Symbol[struct{}] {
|
||||||
|
return Mapping(sym, func(T) struct{} { return struct{}{} })
|
||||||
|
}
|
186
gg/json.go
@ -1,186 +0,0 @@
package gg

import (
	"encoding/json"
	"fmt"
)

type openEdgeJSON struct {
	From    vertexJSON `json:"from"`
	ValueID string     `json:"valueID"`
}

type vertexJSON struct {
	Type    VertexType     `json:"type"`
	ValueID string         `json:"valueID,omitempty"`
	In      []openEdgeJSON `json:"in"`
}

type graphJSON struct {
	Values        map[string]json.RawMessage `json:"values"`
	ValueVertices []vertexJSON               `json:"valueVertices"`
}

// MarshalJSON implements the json.Marshaler interface for a Graph. All Values
// in the Graph will have json.Marshal called on them as-is in order to marshal
// them.
func (g *Graph) MarshalJSON() ([]byte, error) {
	gJ := graphJSON{
		Values:        map[string]json.RawMessage{},
		ValueVertices: make([]vertexJSON, 0, len(g.vM)),
	}

	withVal := func(val Value) (string, error) {
		if _, ok := gJ.Values[val.ID]; !ok {
			valJ, err := json.Marshal(val.V)
			if err != nil {
				return "", err
			}
			gJ.Values[val.ID] = json.RawMessage(valJ)
		}
		return val.ID, nil
	}

	// two locally defined, mutually recursive functions. This kind of thing
	// could probably be abstracted out, I feel like it happens frequently with
	// graph code.
	var mkIns func([]OpenEdge) ([]openEdgeJSON, error)
	var mkVert func(vertex) (vertexJSON, error)

	mkIns = func(in []OpenEdge) ([]openEdgeJSON, error) {
		inJ := make([]openEdgeJSON, len(in))
		for i := range in {
			valID, err := withVal(in[i].val)
			if err != nil {
				return nil, err
			}
			vJ, err := mkVert(in[i].fromV)
			if err != nil {
				return nil, err
			}
			inJ[i] = openEdgeJSON{From: vJ, ValueID: valID}
		}
		return inJ, nil
	}

	mkVert = func(v vertex) (vertexJSON, error) {
		ins, err := mkIns(v.in)
		if err != nil {
			return vertexJSON{}, err
		}
		vJ := vertexJSON{
			Type: v.VertexType,
			In:   ins,
		}
		if v.VertexType == ValueVertex {
			valID, err := withVal(v.val)
			if err != nil {
				return vJ, err
			}
			vJ.ValueID = valID
		}
		return vJ, nil
	}

	for _, v := range g.vM {
		vJ, err := mkVert(v)
		if err != nil {
			return nil, err
		}
		gJ.ValueVertices = append(gJ.ValueVertices, vJ)
	}

	return json.Marshal(gJ)
}

type jsonUnmarshaler struct {
	g  *Graph
	fn func(json.RawMessage) (interface{}, error)
}

// JSONUnmarshaler returns a json.Unmarshaler instance which, when used, will
// unmarshal a json string into the Graph instance being called on here.
//
// The passed in function is used to unmarshal Values (used in both ValueVertex
// vertices and edges) from json strings into go values. The returned
// interface{} should have already had the unmarshal from the given json string
// performed on it.
//
// The json.Unmarshaler returned can be used many times, but will reset the
// Graph completely before each use.
func (g *Graph) JSONUnmarshaler(fn func(json.RawMessage) (interface{}, error)) json.Unmarshaler {
	return jsonUnmarshaler{g: g, fn: fn}
}

func (jm jsonUnmarshaler) UnmarshalJSON(b []byte) error {
	*(jm.g) = Graph{}
	jm.g.vM = map[string]vertex{}

	var gJ graphJSON
	if err := json.Unmarshal(b, &gJ); err != nil {
		return err
	}

	vals := map[string]Value{}
	getVal := func(valID string) (Value, error) {
		if val, ok := vals[valID]; ok {
			return val, nil
		}

		j, ok := gJ.Values[valID]
		if !ok {
			return Value{}, fmt.Errorf("unmarshaling malformed graph, value with ID %q not defined", valID)
		}

		V, err := jm.fn(j)
		if err != nil {
			return Value{}, err
		}

		val := Value{ID: valID, V: V}
		vals[valID] = val
		return val, nil
	}

	var mkIns func([]openEdgeJSON) ([]OpenEdge, error)
	var mkVert func(vertexJSON) (vertex, error)

	mkIns = func(inJ []openEdgeJSON) ([]OpenEdge, error) {
		in := make([]OpenEdge, len(inJ))
		for i := range inJ {
			val, err := getVal(inJ[i].ValueID)
			if err != nil {
				return nil, err
			}
			v, err := mkVert(inJ[i].From)
			if err != nil {
				return nil, err
			}
			in[i] = OpenEdge{fromV: v, val: val}
		}
		return in, nil
	}

	mkVert = func(vJ vertexJSON) (vertex, error) {
		ins, err := mkIns(vJ.In)
		if err != nil {
			return vertex{}, err
		}
		var val Value
		if vJ.Type == ValueVertex {
			if val, err = getVal(vJ.ValueID); err != nil {
				return vertex{}, err
			}
		}
		return mkVertex(vJ.Type, val, ins...), nil
	}

	for _, v := range gJ.ValueVertices {
		v, err := mkVert(v)
		if err != nil {
			return err
		}
		jm.g.vM[v.id] = v
	}
	return nil
}
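The "two locally defined, mutually recursive functions" pattern that both MarshalJSON and UnmarshalJSON rely on (mkIns/mkVert) can be sketched standalone. This is a minimal illustration of the Go idiom only, with stand-in names (isEven/isOdd): declare both function variables up front, then assign their bodies, so each closure can call the other.

```go
package main

import "fmt"

// parity demonstrates mutually recursive closures: both function variables
// are declared before either body is assigned, so each body can reference
// the other (and itself).
func parity(n int) bool {
	var isEven, isOdd func(n int) bool
	isEven = func(n int) bool {
		if n == 0 {
			return true
		}
		return isOdd(n - 1)
	}
	isOdd = func(n int) bool {
		if n == 0 {
			return false
		}
		return isEven(n - 1)
	}
	return isEven(n)
}

func main() {
	fmt.Println(parity(10), parity(7)) // true false
}
```

The declare-then-assign split is what makes the recursion legal; assigning both in a single `:=` statement would leave each body referring to a not-yet-declared variable.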
111
gim/NOTES
@ -1,111 +0,0 @@
Notes from reading https://www.graphviz.org/Documentation/TSE93.pdf, which
describes an algorithm for drawing an acyclic graph in basically the way which I
want.

This document assumes the primary flow of drawing is downward, and secondary is
right.

For all of this it might be easier to not even consider edge values yet, as
those could be done by converting them into vertices themselves after the
cyclic-edge-reversal and then converting them back later.

Drawing the graph is a four step process:

1) Rank nodes in the Y axis
    - Graph must be acyclic.
        - This can be accomplished by strategically reversing edges which cause
          a cycle, and then reversing them back as a post-processing step.
        - Edges can be found by:
            - walking out from a particular node depth-first from some arbitrary
              node.
            - As you do so you assign a rank based on depth to each node you
              encounter.
            - If any edge is destined for a node which has already been seen you
              look at the ranks of the source and destination, and if the source
              is _greater_ than the destination you reverse the edge's
              direction.
        - I think that algorithm only works if there's a source/sink? might have
          to be modified, or the walk must traverse both to & from.
    - Assign all edges a weight, default 1, but possibly externally assigned to
      be greater.
    - Take a "feasible" minimum spanning tree (MST) of the graph
        - Feasibility is defined as each edge being "tight", meaning, once you
          rank each node by their distance from the root and define the length
          of an edge as the difference of rank of its head and tail, that each
          tree edge will have a length of 1.
    - Perform the following on the MST:
        - For each edge of the graph assign the cut value
            - If you were to remove any edge of an MST it would create two
              separate MSTs. The side the edge was pointing from is the tail,
              the side it was pointing to is the head.
            - Looking at edges _in the original graph_, sum the weights of all
              edges directed from the tail to the head (including the one
              removed) and subtract from that the sum of the weights of the
              edges directed from the head to the tail. This is the cut value.
            - "...note that the cut values can be computed using information
              local to an edge if the search is ordered from the leaves of the
              feasible tree inward. It is trivial to compute the cut value of a
              tree edge with one of its endpoints a leaf in the tree, since
              either the head or the tail component consists of a single node.
              Now, assuming the cut values are known for all the edges incident
              on a given node except one, the cut value of the remaining edge is
              the sum of the known cut values plus a term dependent only on the
              edges incident to the given node."
        - Take an edge with a negative cut value and remove it. Find the graph
          edge between the remaining head and tail MSTs with the smallest
          "slack" (distance in rank between its ends) and add that edge to the
          MST to make it connected again.
        - Repeat until there are no negative cut values.
        - Apparently searching "cyclically" through the negative edges, rather
          than iterating from the start each time, is worthwhile.
    - Normalize the MST by assigning the root node the rank of 0 (and so on), if
      it changed.
    - All edges in the MST are of length 1, and the rest can be inferred from
      that.
    - To reduce crowding, nodes with equal in/out edge weights and which could
      be placed on multiple rankings are moved to the ranking with the fewest
      nodes.

2) Order nodes in the X axis to reduce edge crossings
    - Add ephemeral vertices along edges with lengths greater than 1, so all
      "spaces" are filled.
    - If any vertices have edges to vertices on their same rank, those are
      ordered so that all these "flag edges" are pointed in the same direction
      across that rank, and the ordering of those particular vertices is always
      kept.
    - Iterate over the graph some fixed number of times (the paper recommends
      24)
        - possibly with some heuristic which looks at percentage improvement
          each time to determine if it's worth the effort.
        - on one iteration move "down" the graph, on the next move "up", etc...
          shaker style
        - On each iteration:
            - For each vertex look at the median position of all of the vertices
              it has edges to in the previous rank
                - If the number of previous vertices is even do this complicated
                  thing (P is the set of positions previous):
                  ```
                  if |P| = 2 then
                      return (P[0] + P[1])/2;
                  else
                      left = P[m-1] - P[0];
                      right = P[|P| -1] - P[m];
                      return (P[m-1]*right + P[m]*left)/(left+right);
                  endif
                  ```
            - Sort the vertices by their median position
                - vertices with no previous vertices remain fixed
            - Then, for each vertex in the rank attempt to transpose it with its
              neighbor and see if that reduces the number of edge crossings
              between the rank and its previous.
            - If equality is found during these two steps (same median, or same
              number of crossings) the vertices in question should be flipped.

3) Compute node coordinates
    - Determining the Y coordinates is considered trivial: find the maxHeight of
      each rank, and ensure they are separated by that much plus whatever the
      separation value is.
    - For the X coordinates: do some insane shit involving the network simplex
      again.

4) Determine edge splines
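The median-position step from the ordering notes above can be sketched as a standalone Go function. medianPos is a hypothetical name; it follows the paper's description (odd count takes the middle element, a count of two averages, larger even counts interpolate with the weighting from the pseudocode, biased toward the side where neighbor positions are packed more tightly).

```go
package main

import "fmt"

// medianPos computes the median position of a vertex's neighbors in the
// adjacent rank. P must be the sorted list of those neighbors' positions.
func medianPos(P []int) int {
	m := len(P) / 2
	switch {
	case len(P) == 0:
		return -1 // no neighbors: caller leaves the vertex fixed
	case len(P)%2 == 1:
		return P[m] // odd count: plain median
	case len(P) == 2:
		return (P[0] + P[1]) / 2
	default:
		// even count: weighted interpolation between the two middle
		// positions, per the pseudocode above
		left := P[m-1] - P[0]
		right := P[len(P)-1] - P[m]
		return (P[m-1]*right + P[m]*left) / (left + right)
	}
}

func main() {
	fmt.Println(medianPos([]int{1, 4, 6}))     // 4
	fmt.Println(medianPos([]int{2, 4}))        // 3
	fmt.Println(medianPos([]int{0, 1, 2, 10})) // 1 (pulled toward the tight left cluster)
}
```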
139
gim/geo/geo.go
@ -1,139 +0,0 @@
// Package geo implements basic geometric concepts used by gim
package geo

import "math"

// XY describes a 2-dimensional position or vector. The origin of the
// 2-dimensional space is at 0,0, with the x-axis going to the right and the
// y-axis going down.
type XY [2]int

// Zero is the zero point, or a zero vector, depending on what you're doing
var Zero = XY{0, 0}

// Unit vectors
var (
	Up    = XY{0, -1}
	Down  = XY{0, 1}
	Left  = XY{-1, 0}
	Right = XY{1, 0}
)

// Units is the set of unit vectors
var Units = []XY{
	Up,
	Down,
	Left,
	Right,
}

func (xy XY) toF64() [2]float64 {
	return [2]float64{
		float64(xy[0]),
		float64(xy[1]),
	}
}

func abs(i int) int {
	if i < 0 {
		return i * -1
	}
	return i
}

// Abs returns the XY with all fields made positive, if they weren't already
func (xy XY) Abs() XY {
	return XY{abs(xy[0]), abs(xy[1])}
}

// Unit returns the XY with each field divided by its absolute value (i.e.
// scaled down to 1 or -1). Fields which are 0 are left alone
func (xy XY) Unit() XY {
	for i := range xy {
		if xy[i] > 0 {
			xy[i] = 1
		} else if xy[i] < 0 {
			xy[i] = -1
		}
	}
	return xy
}

// Len returns the length (aka magnitude) of the XY as a vector.
func (xy XY) Len() int {
	if xy[0] == 0 {
		return abs(xy[1])
	} else if xy[1] == 0 {
		return abs(xy[0])
	}

	xyf := xy.toF64()
	lf := math.Sqrt((xyf[0] * xyf[0]) + (xyf[1] * xyf[1]))
	return Rounder.Round(lf)
}

// Add returns the result of adding the two XYs' fields individually
func (xy XY) Add(xy2 XY) XY {
	xy[0] += xy2[0]
	xy[1] += xy2[1]
	return xy
}

// Mul returns the result of multiplying the two XYs' fields individually
func (xy XY) Mul(xy2 XY) XY {
	xy[0] *= xy2[0]
	xy[1] *= xy2[1]
	return xy
}

// Div returns the result of dividing the two XYs' fields individually.
func (xy XY) Div(xy2 XY) XY {
	xyf, xy2f := xy.toF64(), xy2.toF64()
	return XY{
		Rounder.Round(xyf[0] / xy2f[0]),
		Rounder.Round(xyf[1] / xy2f[1]),
	}
}

// Scale returns the result of multiplying both of the XY's fields by the scalar
func (xy XY) Scale(scalar int) XY {
	return xy.Mul(XY{scalar, scalar})
}

// Inv inverts the XY, a shortcut for xy.Scale(-1)
func (xy XY) Inv() XY {
	return xy.Scale(-1)
}

// Sub subtracts xy2 from xy and returns the result. A shortcut for
// xy.Add(xy2.Inv())
func (xy XY) Sub(xy2 XY) XY {
	return xy.Add(xy2.Inv())
}

// Midpoint returns the midpoint between the two XYs.
func (xy XY) Midpoint(xy2 XY) XY {
	return xy.Add(xy2.Sub(xy).Div(XY{2, 2}))
}

// Min returns an XY whose fields are the minimum values of the two XYs'
// fields compared individually
func (xy XY) Min(xy2 XY) XY {
	for i := range xy {
		if xy2[i] < xy[i] {
			xy[i] = xy2[i]
		}
	}
	return xy
}

// Max returns an XY whose fields are the maximum values of the two XYs'
// fields compared individually
func (xy XY) Max(xy2 XY) XY {
	for i := range xy {
		if xy2[i] > xy[i] {
			xy[i] = xy2[i]
		}
	}
	return xy
}
127
gim/geo/rect.go
@ -1,127 +0,0 @@
package geo

import (
	"fmt"
)

// Rect describes a rectangle based on the position of its top-left corner and
// size
type Rect struct {
	TopLeft XY
	Size    XY
}

// Edge describes a straight edge starting at its first XY and ending at its
// second
type Edge [2]XY

// EdgeCoord returns the coordinate of the edge indicated by the given direction
// (Up, Down, Left, or Right). The coordinate will be for the axis applicable to
// the direction, so for Left/Right it will be the x coordinate and for Up/Down
// the y.
func (r Rect) EdgeCoord(dir XY) int {
	switch dir {
	case Up:
		return r.TopLeft[1]
	case Down:
		return r.TopLeft[1] + r.Size[1] - 1
	case Left:
		return r.TopLeft[0]
	case Right:
		return r.TopLeft[0] + r.Size[0] - 1
	default:
		panic(fmt.Sprintf("unsupported direction: %#v", dir))
	}
}

// Corner returns the position of the corner identified by the given directions
// (Left/Right, Up/Down)
func (r Rect) Corner(xDir, yDir XY) XY {
	switch {
	case r.Size[0] == 0 || r.Size[1] == 0:
		panic(fmt.Sprintf("rectangle with non-multidimensional size has no corners: %v", r.Size))
	case xDir == Left && yDir == Up:
		return r.TopLeft
	case xDir == Right && yDir == Up:
		return r.TopLeft.Add(r.Size.Mul(Right)).Add(XY{-1, 0})
	case xDir == Left && yDir == Down:
		return r.TopLeft.Add(r.Size.Mul(Down)).Add(XY{0, -1})
	case xDir == Right && yDir == Down:
		return r.TopLeft.Add(r.Size).Add(XY{-1, -1})
	default:
		panic(fmt.Sprintf("unsupported Corner args: %v, %v", xDir, yDir))
	}
}

// Edge returns an Edge instance for the edge of the Rect indicated by the given
// direction (Up, Down, Left, or Right). secDir indicates the direction the
// returned Edge should be pointing (i.e. the order of its XYs) and must be
// perpendicular to dir
func (r Rect) Edge(dir, secDir XY) Edge {
	var e Edge
	switch dir {
	case Up:
		e[0], e[1] = r.Corner(Left, Up), r.Corner(Right, Up)
	case Down:
		e[0], e[1] = r.Corner(Left, Down), r.Corner(Right, Down)
	case Left:
		e[0], e[1] = r.Corner(Left, Up), r.Corner(Left, Down)
	case Right:
		e[0], e[1] = r.Corner(Right, Up), r.Corner(Right, Down)
	default:
		panic(fmt.Sprintf("unsupported direction: %#v", dir))
	}

	switch secDir {
	case Left, Up:
		e[0], e[1] = e[1], e[0]
	default:
		// do nothing
	}
	return e
}

// Midpoint returns the point which is the midpoint of the Edge
func (e Edge) Midpoint() XY {
	return e[0].Midpoint(e[1])
}

func (r Rect) halfSize() XY {
	return r.Size.Div(XY{2, 2})
}

// Center returns the centerpoint of the rectangle.
func (r Rect) Center() XY {
	return r.TopLeft.Add(r.halfSize())
}

// Translate returns an instance of Rect which is the same as this one but
// translated by the given amount.
func (r Rect) Translate(by XY) Rect {
	r.TopLeft = r.TopLeft.Add(by)
	return r
}

// Centered returns an instance of Rect which is this one but translated to be
// centered on the given point.
func (r Rect) Centered(on XY) Rect {
	r.TopLeft = on.Sub(r.halfSize())
	return r
}

// Union returns the smallest Rect which encompasses the given Rect and the one
// being called upon.
func (r Rect) Union(r2 Rect) Rect {
	if r.Size == Zero {
		return r2
	} else if r2.Size == Zero {
		return r
	}

	tl := r.TopLeft.Min(r2.TopLeft)
	br := r.Corner(Right, Down).Max(r2.Corner(Right, Down))
	return Rect{
		TopLeft: tl,
		Size:    br.Sub(tl).Add(XY{1, 1}),
	}
}
@@ -1,113 +0,0 @@
package geo

import (
	. "testing"

	"github.com/stretchr/testify/assert"
)

func TestRect(t *T) {
	r := Rect{
		TopLeft: XY{1, 2},
		Size:    XY{2, 2},
	}

	assert.Equal(t, 2, r.EdgeCoord(Up))
	assert.Equal(t, 3, r.EdgeCoord(Down))
	assert.Equal(t, 1, r.EdgeCoord(Left))
	assert.Equal(t, 2, r.EdgeCoord(Right))

	lu := XY{1, 2}
	ld := XY{1, 3}
	ru := XY{2, 2}
	rd := XY{2, 3}

	assert.Equal(t, lu, r.Corner(Left, Up))
	assert.Equal(t, ld, r.Corner(Left, Down))
	assert.Equal(t, ru, r.Corner(Right, Up))
	assert.Equal(t, rd, r.Corner(Right, Down))

	assert.Equal(t, Edge{lu, ld}, r.Edge(Left, Down))
	assert.Equal(t, Edge{ru, rd}, r.Edge(Right, Down))
	assert.Equal(t, Edge{lu, ru}, r.Edge(Up, Right))
	assert.Equal(t, Edge{ld, rd}, r.Edge(Down, Right))
	assert.Equal(t, Edge{ld, lu}, r.Edge(Left, Up))
	assert.Equal(t, Edge{rd, ru}, r.Edge(Right, Up))
	assert.Equal(t, Edge{ru, lu}, r.Edge(Up, Left))
	assert.Equal(t, Edge{rd, ld}, r.Edge(Down, Left))
}

func TestRectCenter(t *T) {
	assertCentered := func(exp, given Rect, center XY) {
		got := given.Centered(center)
		assert.Equal(t, exp, got)
		assert.Equal(t, center, got.Center())
	}

	{
		r := Rect{
			Size: XY{4, 4},
		}
		assert.Equal(t, XY{2, 2}, r.Center())
		assertCentered(
			Rect{TopLeft: XY{1, 1}, Size: XY{4, 4}},
			r, XY{3, 3},
		)
	}

	{
		r := Rect{
			Size: XY{5, 5},
		}
		assert.Equal(t, XY{3, 3}, r.Center())
		assertCentered(
			Rect{TopLeft: XY{0, 0}, Size: XY{5, 5}},
			r, XY{3, 3},
		)
	}
}

func TestRectUnion(t *T) {
	assertUnion := func(exp, r1, r2 Rect) {
		assert.Equal(t, exp, r1.Union(r2))
		assert.Equal(t, exp, r2.Union(r1))
	}

	{ // Zero
		r := Rect{TopLeft: XY{1, 1}, Size: XY{2, 2}}
		assertUnion(r, r, Rect{})
	}

	{ // Equal
		r := Rect{Size: XY{2, 2}}
		assertUnion(r, r, r)
	}

	{ // Overlapping corner
		r1 := Rect{TopLeft: XY{0, 0}, Size: XY{2, 2}}
		r2 := Rect{TopLeft: XY{1, 1}, Size: XY{2, 2}}
		ex := Rect{TopLeft: XY{0, 0}, Size: XY{3, 3}}
		assertUnion(ex, r1, r2)
	}

	{ // 2 overlapping corners
		r1 := Rect{TopLeft: XY{0, 0}, Size: XY{4, 4}}
		r2 := Rect{TopLeft: XY{1, 1}, Size: XY{4, 2}}
		ex := Rect{TopLeft: XY{0, 0}, Size: XY{5, 4}}
		assertUnion(ex, r1, r2)
	}

	{ // Shared edge
		r1 := Rect{TopLeft: XY{0, 0}, Size: XY{2, 1}}
		r2 := Rect{TopLeft: XY{1, 0}, Size: XY{1, 2}}
		ex := Rect{TopLeft: XY{0, 0}, Size: XY{2, 2}}
		assertUnion(ex, r1, r2)
	}

	{ // Adjacent edge
		r1 := Rect{TopLeft: XY{0, 0}, Size: XY{2, 2}}
		r2 := Rect{TopLeft: XY{2, 0}, Size: XY{2, 2}}
		ex := Rect{TopLeft: XY{0, 0}, Size: XY{4, 2}}
		assertUnion(ex, r1, r2)
	}
}
@@ -1,33 +0,0 @@
package geo

import (
	"math"
)

// RounderFunc is a function which converts a floating point number into an
// integer.
type RounderFunc func(float64) int64

// Round is a helper for calling the RounderFunc and converting the result to
// an int.
func (rf RounderFunc) Round(f float64) int {
	return int(rf(f))
}

// A few RounderFuncs which can be used. Set the Rounder global variable to pick
// one.
var (
	Floor RounderFunc = func(f float64) int64 { return int64(math.Floor(f)) }
	Ceil  RounderFunc = func(f float64) int64 { return int64(math.Ceil(f)) }
	Round RounderFunc = func(f float64) int64 {
		if f < 0 {
			f = math.Ceil(f - 0.5)
		}
		f = math.Floor(f + 0.5)
		return int64(f)
	}
)

// Rounder is the RounderFunc which will be used by all functions and methods in
// this package when needed.
var Rounder = Ceil
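The three RounderFuncs disagree only on how fractional values collapse to ints, which is why the package makes Rounder pluggable. A minimal standalone demo, restating the package's types so it runs on its own (the `Round` variant rounds halves away from zero, matching the negative-value special case above):

```go
package main

import (
	"fmt"
	"math"
)

// RounderFunc restates the type from gim/geo: a pluggable float-to-int64
// rounding policy.
type RounderFunc func(float64) int64

// Round calls the RounderFunc and narrows the result to int.
func (rf RounderFunc) Round(f float64) int { return int(rf(f)) }

var (
	Floor RounderFunc = func(f float64) int64 { return int64(math.Floor(f)) }
	Ceil  RounderFunc = func(f float64) int64 { return int64(math.Ceil(f)) }
	// Round rounds half away from zero: negatives mirror positives.
	Round RounderFunc = func(f float64) int64 {
		if f < 0 {
			f = math.Ceil(f - 0.5)
		}
		return int64(math.Floor(f + 0.5))
	}
)

func main() {
	fmt.Println(Floor.Round(-1.5), Ceil.Round(-1.5), Round.Round(-1.5)) // -2 -1 -2
	fmt.Println(Floor.Round(2.5), Ceil.Round(2.5), Round.Round(2.5))    // 2 3 3
}
```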
91
gim/main.go
@@ -1,91 +0,0 @@
package main

import (
	"math/rand"
	"time"

	"github.com/mediocregopher/ginger/gg"
	"github.com/mediocregopher/ginger/gim/geo"
	"github.com/mediocregopher/ginger/gim/terminal"
	"github.com/mediocregopher/ginger/gim/view"
)

// TODO be able to draw circular graphs
// TODO audit all steps, make sure everything is deterministic
// TODO self-edges

//const (
//	framerate   = 10
//	frameperiod = time.Second / time.Duration(framerate)
//)

//func debugf(str string, args ...interface{}) {
//	if !strings.HasSuffix(str, "\n") {
//		str += "\n"
//	}
//	fmt.Fprintf(os.Stderr, str, args...)
//}

func mkGraph() (*gg.Graph, gg.Value) {
	a := gg.NewValue("a")
	aE0 := gg.NewValue("aE0")
	aE1 := gg.NewValue("aE1")
	aE2 := gg.NewValue("aE2")
	aE3 := gg.NewValue("aE3")
	b0 := gg.NewValue("b0")
	b1 := gg.NewValue("b1")
	b2 := gg.NewValue("b2")
	b3 := gg.NewValue("b3")
	oaE0 := gg.ValueOut(a, aE0)
	oaE1 := gg.ValueOut(a, aE1)
	oaE2 := gg.ValueOut(a, aE2)
	oaE3 := gg.ValueOut(a, aE3)
	g := gg.Null
	g = g.AddValueIn(oaE0, b0)
	g = g.AddValueIn(oaE1, b1)
	g = g.AddValueIn(oaE2, b2)
	g = g.AddValueIn(oaE3, b3)

	c := gg.NewValue("c")
	empty := gg.NewValue("")
	jE := gg.JunctionOut([]gg.OpenEdge{
		gg.ValueOut(b0, empty),
		gg.ValueOut(b1, empty),
		gg.ValueOut(b2, empty),
		gg.ValueOut(b3, empty),
	}, gg.NewValue("jE"))
	g = g.AddValueIn(jE, c)

	// TODO this really fucks it up
	//d := gg.NewValue("d")
	//deE := gg.ValueOut(d, gg.NewValue("deE"))
	//g = g.AddValueIn(deE, gg.NewValue("e"))

	return g, c
}

//func mkGraph() *gg.Graph {
//	g := gg.Null
//	g = g.AddValueIn(gg.ValueOut(str("a"), str("e")), str("b"))
//	return g
//}

func main() {
	rand.Seed(time.Now().UnixNano())
	term := terminal.New()
	wSize := term.WindowSize()
	center := geo.Zero.Midpoint(wSize)

	g, start := mkGraph()
	view := view.New(g, start, geo.Right, geo.Down)
	viewBuf := terminal.NewBuffer()
	view.Draw(viewBuf)

	buf := terminal.NewBuffer()
	buf.DrawBufferCentered(center, viewBuf)

	term.Clear()
	term.WriteBuffer(geo.Zero, buf)
	term.SetPos(wSize.Add(geo.XY{0, -1}))
	term.Draw()
}
@@ -1,217 +0,0 @@
package terminal

import (
	"fmt"
	"strconv"
	"unicode"

	"github.com/mediocregopher/ginger/gim/geo"
)

// Reset all custom styles
const ansiReset = "\033[0m"

// Color describes the foreground or background color of text
type Color int

// Available Color values
const (
	// whatever the terminal's default color scheme is
	Default = iota

	Black
	Red
	Green
	Yellow
	Blue
	Magenta
	Cyan
	White
)

type bufStyle struct {
	fgColor Color
	bgColor Color
}

// returns foreground and background ansi codes
func (bf bufStyle) ansi() (string, string) {
	var fg, bg string
	if bf.fgColor != Default {
		fg = "\033[0;3" + strconv.Itoa(int(bf.fgColor)-1) + "m"
	}
	if bf.bgColor != Default {
		bg = "\033[0;4" + strconv.Itoa(int(bf.bgColor)-1) + "m"
	}
	return fg, bg
}

// returns the ansi sequence which would modify the style to the given one
func (bf bufStyle) diffTo(bf2 bufStyle) string {
	// this implementation is naive, but whatever
	if bf == bf2 {
		return ""
	}

	fg, bg := bf2.ansi()
	if (bf == bufStyle{}) {
		return fg + bg
	}
	return ansiReset + fg + bg
}

type bufPoint struct {
	r rune
	bufStyle
}

// Buffer describes an infinitely sized terminal buffer to which anything may
// be drawn, and which will efficiently generate strings representing the
// drawn text.
type Buffer struct {
	currStyle bufStyle
	currPos   geo.XY
	m         *mat
	max       geo.XY
}

// NewBuffer initializes and returns a new empty buffer. The proper way to
// clear a buffer is to toss the old one and generate a new one.
func NewBuffer() *Buffer {
	return &Buffer{
		m:   newMat(),
		max: geo.XY{-1, -1},
	}
}

// Copy creates a new identical instance of this Buffer and returns it.
func (b *Buffer) Copy() *Buffer {
	b2 := NewBuffer()
	b.m.iter(func(x, y int, v interface{}) bool {
		b2.setRune(geo.XY{x, y}, v.(bufPoint))
		return true
	})
	b2.currStyle = b.currStyle
	b2.currPos = b.currPos
	return b2
}

func (b *Buffer) setRune(at geo.XY, p bufPoint) {
	b.m.set(at[0], at[1], p)
	b.max = b.max.Max(at)
}

// WriteRune writes the given rune to the Buffer at whatever the current
// position is, with whatever the current styling is.
func (b *Buffer) WriteRune(r rune) {
	if r == '\n' {
		b.currPos[0], b.currPos[1] = 0, b.currPos[1]+1
		return
	} else if r == '\r' {
		b.currPos[0] = 0
	} else if !unicode.IsPrint(r) {
		panic(fmt.Sprintf("character %q is not supported by terminal.Buffer", r))
	}

	b.setRune(b.currPos, bufPoint{
		r:        r,
		bufStyle: b.currStyle,
	})
	b.currPos[0]++
}

// WriteString writes the given string to the Buffer at whatever the current
// position is, with whatever the current styling is.
func (b *Buffer) WriteString(s string) {
	for _, r := range s {
		b.WriteRune(r)
	}
}

// SetPos sets the cursor position in the Buffer, so Print operations will
// begin at that point. Remember that the origin is at point (0, 0).
func (b *Buffer) SetPos(xy geo.XY) {
	b.currPos = xy
}

// SetFGColor sets subsequent text's foreground color.
func (b *Buffer) SetFGColor(c Color) {
	b.currStyle.fgColor = c
}

// SetBGColor sets subsequent text's background color.
func (b *Buffer) SetBGColor(c Color) {
	b.currStyle.bgColor = c
}

// ResetStyle unsets all text styling options which have been set.
func (b *Buffer) ResetStyle() {
	b.currStyle = bufStyle{}
}

// String renders and returns a string which, when printed to a terminal, will
// print the Buffer's contents at the terminal's current cursor position.
func (b *Buffer) String() string {
	s := ansiReset // always start with a reset
	var style bufStyle
	var pos geo.XY
	move := func(to geo.XY) {
		diff := to.Sub(pos)
		if diff[0] > 0 {
			s += "\033[" + strconv.Itoa(diff[0]) + "C"
		} else if diff[0] < 0 {
			s += "\033[" + strconv.Itoa(-diff[0]) + "D"
		}
		if diff[1] > 0 {
			s += "\033[" + strconv.Itoa(diff[1]) + "B"
		} else if diff[1] < 0 {
			s += "\033[" + strconv.Itoa(-diff[1]) + "A"
		}
		pos = to
	}

	b.m.iter(func(x, y int, v interface{}) bool {
		p := v.(bufPoint)
		move(geo.XY{x, y})
		s += style.diffTo(p.bufStyle)
		style = p.bufStyle
		s += string(p.r)
		pos[0]++
		return true
	})
	return s
}

// DrawBuffer copies the given Buffer onto this one, with the given's top-left
// corner being at the given position. The given buffer may be the same as
// this one.
//
// Calling this method does not affect this Buffer's current cursor position
// or style.
func (b *Buffer) DrawBuffer(at geo.XY, b2 *Buffer) {
	if b == b2 {
		b2 = b2.Copy()
	}
	b2.m.iter(func(x, y int, v interface{}) bool {
		x += at[0]
		y += at[1]
		if x < 0 || y < 0 {
			return true
		}
		b.setRune(geo.XY{x, y}, v.(bufPoint))
		return true
	})
}

// DrawBufferCentered is like DrawBuffer, but centered around the given point
// instead of translated by it.
func (b *Buffer) DrawBufferCentered(around geo.XY, b2 *Buffer) {
	b2rect := geo.Rect{Size: b2.Size()}
	b.DrawBuffer(b2rect.Centered(around).TopLeft, b2)
}

// Size returns the dimensions of the Buffer's current area which has been
// written to.
func (b *Buffer) Size() geo.XY {
	return b.max.Add(geo.XY{1, 1})
}
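The move closure inside Buffer's String method emits relative ANSI cursor movements rather than absolute positioning. The same escape-sequence arithmetic in isolation (CSI `C`/`D` move right/left and `B`/`A` move down/up per the standard ANSI control sequences; the `moveSeq` helper name is ours):

```go
package main

import (
	"fmt"
	"strconv"
)

// moveSeq returns the ANSI escape sequence which moves the cursor from
// (fromX, fromY) to (toX, toY): C/D move right/left, B/A move down/up.
func moveSeq(fromX, fromY, toX, toY int) string {
	var s string
	if dx := toX - fromX; dx > 0 {
		s += "\033[" + strconv.Itoa(dx) + "C"
	} else if dx < 0 {
		s += "\033[" + strconv.Itoa(-dx) + "D"
	}
	if dy := toY - fromY; dy > 0 {
		s += "\033[" + strconv.Itoa(dy) + "B"
	} else if dy < 0 {
		s += "\033[" + strconv.Itoa(-dy) + "A"
	}
	return s
}

func main() {
	fmt.Printf("%q\n", moveSeq(0, 0, 3, 2)) // right 3, down 2
	fmt.Printf("%q\n", moveSeq(5, 5, 2, 5)) // left 3 only
}
```

Relative movement keeps the rendered string short when iterating a sparse buffer in row-major order, since most moves are a single step.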
@@ -1,59 +0,0 @@
package main

import (
	"log"
	"time"

	"github.com/mediocregopher/ginger/gim/geo"
	"github.com/mediocregopher/ginger/gim/terminal"
)

func main() {
	b := terminal.NewBuffer()
	b.WriteString("this is fun")

	b.SetFGColor(terminal.Blue)
	b.SetBGColor(terminal.Green)
	b.SetPos(geo.XY{18, 0})
	b.WriteString("blue and green")

	b.ResetStyle()
	b.SetFGColor(terminal.Red)
	b.SetPos(geo.XY{3, 3})
	b.WriteString("red!!!")

	b.ResetStyle()
	b.SetFGColor(terminal.Blue)
	b.SetPos(geo.XY{20, 0})
	b.WriteString("boo")

	bcp := b.Copy()
	b.DrawBuffer(geo.XY{2, 2}, bcp)
	b.DrawBuffer(geo.XY{-1, 1}, bcp)

	brect := terminal.NewBuffer()
	brect.DrawRect(geo.Rect{Size: b.Size().Add(geo.XY{2, 2})}, terminal.SingleLine)
	log.Printf("b.Size:%v", b.Size())
	brect.DrawBuffer(geo.XY{1, 1}, b)

	t := terminal.New()
	p := geo.XY{0, 0}
	dirH, dirV := geo.Right, geo.Down
	wsize := t.WindowSize()
	for range time.Tick(time.Second / 15) {
		t.Clear()
		t.WriteBuffer(p, brect)
		t.Draw()

		brectSize := brect.Size()
		p = p.Add(dirH).Add(dirV)
		if p[0] < 0 || p[0]+brectSize[0] > wsize[0] {
			dirH = dirH.Scale(-1)
			p = p.Add(dirH.Scale(2))
		}
		if p[1] < 0 || p[1]+brectSize[1] > wsize[1] {
			dirV = dirV.Scale(-1)
			p = p.Add(dirV.Scale(2))
		}
	}
}
@@ -1,117 +0,0 @@
package terminal

import (
	"container/list"
)

type matEl struct {
	x int
	v interface{}
}

type matRow struct {
	y int
	l *list.List
}

// a 2-d sparse matrix
type mat struct {
	rows *list.List

	currY     int
	currRowEl *list.Element
	currEl    *list.Element
}

func newMat() *mat {
	return &mat{
		rows: list.New(),
	}
}

func (m *mat) getRow(y int) *list.List {
	m.currY = y // this will end up being true no matter what
	if m.currRowEl == nil { // first call
		l := list.New()
		m.currRowEl = m.rows.PushFront(matRow{y: y, l: l})
		return l

	} else if m.currRowEl.Value.(matRow).y > y {
		m.currRowEl = m.rows.Front()
	}

	for {
		currRow := m.currRowEl.Value.(matRow)
		switch {
		case currRow.y == y:
			return currRow.l
		case currRow.y < y:
			if m.currRowEl = m.currRowEl.Next(); m.currRowEl == nil {
				l := list.New()
				m.currRowEl = m.rows.PushBack(matRow{y: y, l: l})
				return l
			}
		default: // currRow.y > y
			l := list.New()
			m.currRowEl = m.rows.InsertBefore(matRow{y: y, l: l}, m.currRowEl)
			return l
		}
	}
}

func (m *mat) getEl(x, y int) *matEl {
	var rowL *list.List
	if m.currRowEl == nil || m.currY != y {
		rowL = m.getRow(y)
		m.currEl = rowL.Front()
	} else {
		rowL = m.currRowEl.Value.(matRow).l
	}

	if m.currEl == nil || m.currEl.Value.(*matEl).x > x {
		if m.currEl = rowL.Front(); m.currEl == nil {
			// row is empty
			mel := &matEl{x: x}
			m.currEl = rowL.PushFront(mel)
			return mel
		}
	}

	for {
		currEl := m.currEl.Value.(*matEl)
		switch {
		case currEl.x == x:
			return currEl
		case currEl.x < x:
			if m.currEl = m.currEl.Next(); m.currEl == nil {
				mel := &matEl{x: x}
				m.currEl = rowL.PushBack(mel)
				return mel
			}
		default: // currEl.x > x
			mel := &matEl{x: x}
			m.currEl = rowL.InsertBefore(mel, m.currEl)
			return mel
		}
	}
}

func (m *mat) get(x, y int) interface{} {
	return m.getEl(x, y).v
}

func (m *mat) set(x, y int, v interface{}) {
	m.getEl(x, y).v = v
}

func (m *mat) iter(f func(x, y int, v interface{}) bool) {
	for rowEl := m.rows.Front(); rowEl != nil; rowEl = rowEl.Next() {
		row := rowEl.Value.(matRow)
		for el := row.l.Front(); el != nil; el = el.Next() {
			mel := el.Value.(*matEl)
			if !f(mel.x, row.y, mel.v) {
				return
			}
		}
	}
}
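The mat type above keeps sorted linked lists of rows and cells, with cached cursors so near-sequential access (the pattern iter produces) stays cheap, and so iteration is naturally row-major. A minimal map-backed sketch with the same get/set semantics (the `sparse` type is ours, not from the repo) shows the trade-off: simpler code, but no ordered iteration without sorting keys first:

```go
package main

import "fmt"

// sparse is a map-backed 2-d sparse matrix with the same get/set semantics
// as mat above; unset cells read as nil. Unlike mat, iterating it in
// row-major order would require collecting and sorting the keys.
type sparse map[[2]int]interface{}

func (s sparse) get(x, y int) interface{}    { return s[[2]int{x, y}] }
func (s sparse) set(x, y int, v interface{}) { s[[2]int{x, y}] = v }

func main() {
	s := sparse{}
	s.set(3, 1, "hello")
	fmt.Println(s.get(3, 1)) // hello
	fmt.Println(s.get(0, 0) == nil)
}
```

Ordered iteration matters here because Buffer.String walks cells left-to-right, top-to-bottom to emit minimal cursor movements, which is presumably why the linked-list design was chosen over a map.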
@@ -1,59 +0,0 @@
package terminal

import (
	"fmt"
	"math/rand"
	"strings"
	. "testing"
	"time"
)

func TestMat(t *T) {
	r := rand.New(rand.NewSource(time.Now().UnixNano()))

	type xy struct {
		x, y int
	}

	type action struct {
		xy
		set int
	}

	run := func(aa []action) {
		aaStr := func(i int) string {
			s := fmt.Sprintf("%#v", aa[:i+1])
			return strings.Replace(s, "terminal.", "", -1)
		}

		m := newMat()
		mm := map[xy]int{}
		for i, a := range aa {
			if a.set > 0 {
				mm[a.xy] = a.set
				m.set(a.xy.x, a.xy.y, a.set)
				continue
			}

			expI, expOk := mm[a.xy]
			gotI, gotOk := m.get(a.xy.x, a.xy.y).(int)
			if expOk != gotOk {
				t.Fatalf("get failed: expOk:%v gotOk:%v actions:%#v", expOk, gotOk, aaStr(i))
			} else if expI != gotI {
				t.Fatalf("get failed: expI:%v gotI:%v actions:%#v", expI, gotI, aaStr(i))
			}
		}
	}

	for i := 0; i < 10000; i++ {
		var actions []action
		for j := r.Intn(1000); j > 0; j-- {
			a := action{xy: xy{x: r.Intn(5), y: r.Intn(5)}}
			if r.Intn(3) == 0 {
				a.set = r.Intn(10000) + 1
			}
			actions = append(actions, a)
		}
		run(actions)
	}
}
@@ -1,189 +0,0 @@
package terminal

import (
	"fmt"
	"strings"

	"github.com/mediocregopher/ginger/gim/geo"
)

// SingleLine is a set of single-pixel-width lines.
var SingleLine = LineStyle{
	Horiz:       '─',
	Vert:        '│',
	TopLeft:     '┌',
	TopRight:    '┐',
	BottomLeft:  '└',
	BottomRight: '┘',
	PerpUp:      '┴',
	PerpDown:    '┬',
	PerpLeft:    '┤',
	PerpRight:   '├',
	ArrowUp:     '^',
	ArrowDown:   'v',
	ArrowLeft:   '<',
	ArrowRight:  '>',
}

// LineStyle defines a set of characters to use together when drawing lines
// and corners.
type LineStyle struct {
	Horiz, Vert rune

	// Corner characters, identified as corners of a rectangle
	TopLeft, TopRight, BottomLeft, BottomRight rune

	// Characters for a straight segment with a perpendicular attached
	PerpUp, PerpDown, PerpLeft, PerpRight rune

	// Characters for pointing arrows
	ArrowUp, ArrowDown, ArrowLeft, ArrowRight rune
}

// Segment takes two different directions (i.e. geo.Up/Down/Left/Right) and
// returns the line character which points in both of those directions.
//
// For example, SingleLine.Segment(geo.Up, geo.Left) returns '┘'.
func (ls LineStyle) Segment(a, b geo.XY) rune {
	inner := func(a, b geo.XY) rune {
		type c struct{ a, b geo.XY }
		switch (c{a, b}) {
		case c{geo.Up, geo.Down}:
			return ls.Vert
		case c{geo.Left, geo.Right}:
			return ls.Horiz
		case c{geo.Down, geo.Right}:
			return ls.TopLeft
		case c{geo.Down, geo.Left}:
			return ls.TopRight
		case c{geo.Up, geo.Right}:
			return ls.BottomLeft
		case c{geo.Up, geo.Left}:
			return ls.BottomRight
		default:
			return 0
		}
	}
	if r := inner(a, b); r != 0 {
		return r
	} else if r = inner(b, a); r != 0 {
		return r
	}
	panic(fmt.Sprintf("invalid LineStyle.Segment directions: %v, %v", a, b))
}

// Perpendicular returns the line character for a perpendicular segment
// traveling in the given direction.
func (ls LineStyle) Perpendicular(dir geo.XY) rune {
	switch dir {
	case geo.Up:
		return ls.PerpUp
	case geo.Down:
		return ls.PerpDown
	case geo.Left:
		return ls.PerpLeft
	case geo.Right:
		return ls.PerpRight
	default:
		panic(fmt.Sprintf("invalid LineStyle.Perpendicular direction: %v", dir))
	}
}

// Arrow returns the arrow character for an arrow pointing in the given
// direction.
func (ls LineStyle) Arrow(dir geo.XY) rune {
	switch dir {
	case geo.Up:
		return ls.ArrowUp
	case geo.Down:
		return ls.ArrowDown
	case geo.Left:
		return ls.ArrowLeft
	case geo.Right:
		return ls.ArrowRight
	default:
		panic(fmt.Sprintf("invalid LineStyle.Arrow direction: %v", dir))
	}
}

// DrawRect draws the given Rect to the Buffer with the given LineStyle. The
// Rect's TopLeft field is used for its position.
//
// If Rect's Size is not at least 2x2 this does nothing.
func (b *Buffer) DrawRect(r geo.Rect, ls LineStyle) {
	if r.Size[0] < 2 || r.Size[1] < 2 {
		return
	}
	horiz := strings.Repeat(string(ls.Horiz), r.Size[0]-2)

	b.SetPos(r.TopLeft)
	b.WriteRune(ls.TopLeft)
	b.WriteString(horiz)
	b.WriteRune(ls.TopRight)

	for i := 0; i < r.Size[1]-2; i++ {
		b.SetPos(r.TopLeft.Add(geo.XY{0, i + 1}))
		b.WriteRune(ls.Vert)
		b.SetPos(r.TopLeft.Add(geo.XY{r.Size[0] - 1, i + 1}))
		b.WriteRune(ls.Vert)
	}

	b.SetPos(r.TopLeft.Add(geo.XY{0, r.Size[1] - 1}))
	b.WriteRune(ls.BottomLeft)
	b.WriteString(horiz)
	b.WriteRune(ls.BottomRight)
}

// DrawLine draws a line from the start point to the ending one, primarily
// moving in the given direction, using the given LineStyle to do so.
func (b *Buffer) DrawLine(start, end, dir geo.XY, ls LineStyle) {
	// given the "primary" direction the line should be headed, pick a possible
	// secondary one which may be used to detour along the path in order to
	// reach the destination (in the case that the two boxes are diagonal from
	// each other)
	var perpDir geo.XY
	perpDir[0], perpDir[1] = dir[1], dir[0]
	dirSec := end.Sub(start).Mul(perpDir.Abs()).Unit()
	mid := start.Midpoint(end)

	along := func(xy, dir geo.XY) int {
		if dir[0] != 0 {
			return xy[0]
		}
		return xy[1]
	}

	// collect the points along the line into an array
	var pts []geo.XY
	var curr geo.XY
	midPrim := along(mid, dir)
	endSec := along(end, dirSec)
	for curr = start; curr != end; {
		pts = append(pts, curr)
		if prim := along(curr, dir); prim == midPrim {
			if sec := along(curr, dirSec); sec != endSec {
				curr = curr.Add(dirSec)
				continue
			}
		}
		curr = curr.Add(dir)
	}
	pts = append(pts, curr) // appending end

	// draw each point
	for i, pt := range pts {
		var prev, next geo.XY
		switch {
		case i == 0:
			prev = pt.Add(dir.Inv())
			next = pts[i+1]
		case i == len(pts)-1:
			prev = pts[i-1]
			next = pt.Add(dir)
		default:
			prev, next = pts[i-1], pts[i+1]
		}
		b.SetPos(pt)
		b.WriteRune(ls.Segment(prev.Sub(pt), next.Sub(pt)))
	}
}
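Segment above normalizes its direction pair by trying both argument orders against a switch on a struct key, so callers never need to care which direction they pass first. The same trick in a self-contained miniature (the `xy` type and direction values mirror what geo presumably defines, but are assumptions here, not imports from the repo):

```go
package main

import "fmt"

type xy [2]int

// unit direction vectors, assumed to match geo's conventions (y grows down)
var (
	up    = xy{0, -1}
	down  = xy{0, 1}
	left  = xy{-1, 0}
	right = xy{1, 0}
)

// segment returns the single-line box-drawing rune which points in both
// given directions, trying both argument orders so only one ordering of each
// pair needs a switch case. Returns 0 for an unsupported pair.
func segment(a, b xy) rune {
	inner := func(a, b xy) rune {
		type pair struct{ a, b xy }
		switch (pair{a, b}) {
		case pair{up, down}:
			return '│'
		case pair{left, right}:
			return '─'
		case pair{down, right}:
			return '┌'
		case pair{down, left}:
			return '┐'
		case pair{up, right}:
			return '└'
		case pair{up, left}:
			return '┘'
		}
		return 0
	}
	if r := inner(a, b); r != 0 {
		return r
	}
	return inner(b, a)
}

func main() {
	fmt.Printf("%c %c\n", segment(up, left), segment(left, up)) // same rune both ways
}
```

Since structs of comparable fields are comparable in Go, they can be used directly as switch-case values, which keeps the six-pair table flat instead of nesting two switches.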
@@ -1,108 +0,0 @@
// Package terminal implements functionality related to interacting with a
// terminal. Using this package takes the place of using stdout directly.
package terminal

import (
	"bytes"
	"fmt"
	"io"
	"os"
	"syscall"
	"unsafe"

	"github.com/mediocregopher/ginger/gim/geo"
)

// Terminal provides an interface to a terminal which allows for "drawing"
// rather than just writing. Note that operations on a Terminal aren't
// actually drawn to the screen until Draw is called.
//
// The coordinate system described by Terminal looks like this:
//
// 0,0 ------------------> x
//  |
//  |
//  |
//  |
//  |
//  |
//  |
//  |
//  v
//  y
//
type Terminal struct {
	buf *bytes.Buffer

	// When initialized this will be set to os.Stdout, but can be set to
	// anything
	Out io.Writer
}

// New initializes and returns a usable Terminal
func New() *Terminal {
	return &Terminal{
		buf: new(bytes.Buffer),
		Out: os.Stdout,
	}
}

// WindowSize returns the size of the terminal window (width/height)
// TODO this doesn't support winblows
func (t *Terminal) WindowSize() geo.XY {
	var sz struct {
		rows    uint16
		cols    uint16
		xpixels uint16
		ypixels uint16
	}
	_, _, err := syscall.Syscall(
		syscall.SYS_IOCTL,
		uintptr(syscall.Stdin),
		uintptr(syscall.TIOCGWINSZ),
		uintptr(unsafe.Pointer(&sz)),
	)
	if err != 0 {
		panic(err.Error())
	}
	return geo.XY{int(sz.cols), int(sz.rows)}
}

// SetPos sets the terminal's actual cursor position to the given coordinates.
func (t *Terminal) SetPos(to geo.XY) {
	// the actual terminal uses 1,1 as top-left, because 1-indexing is a great
	// idea
	fmt.Fprintf(t.buf, "\033[%d;%dH", to[1]+1, to[0]+1)
}

// HideCursor causes the cursor to not actually be shown
func (t *Terminal) HideCursor() {
	fmt.Fprintf(t.buf, "\033[?25l")
}

// ShowCursor causes the cursor to be shown, if it was previously hidden
func (t *Terminal) ShowCursor() {
	fmt.Fprintf(t.buf, "\033[?25h")
}

// Clear completely clears all drawn characters on the screen and returns the
// cursor to the origin. This implicitly calls Draw.
func (t *Terminal) Clear() {
	t.buf.Reset()
	fmt.Fprintf(t.buf, "\033[2J")
	t.Draw()
}

// WriteBuffer writes the contents of the Buffer to the Terminal's buffer,
// starting at the given coordinate.
func (t *Terminal) WriteBuffer(at geo.XY, b *Buffer) {
	t.SetPos(at)
	t.buf.WriteString(b.String())
}

// Draw writes all buffered changes to the screen
func (t *Terminal) Draw() {
	if _, err := io.Copy(t.Out, t.buf); err != nil {
		panic(err)
	}
	t.buf.Reset()
}
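SetPos above has to translate the package's 0-indexed coordinates into the terminal's 1-indexed CUP (cursor position) sequence. That translation in isolation (the `posSeq` helper name is hypothetical, not from the repo):

```go
package main

import "fmt"

// posSeq renders the ANSI cursor-position sequence for a 0-indexed (x, y).
// Terminals put the origin at row 1, column 1, and take row before column,
// hence the +1 on each coordinate and the swapped order.
func posSeq(x, y int) string {
	return fmt.Sprintf("\033[%d;%dH", y+1, x+1)
}

func main() {
	fmt.Printf("%q\n", posSeq(0, 0)) // origin
	fmt.Printf("%q\n", posSeq(4, 2)) // column 5, row 3
}
```

The row-before-column argument order is an easy place to introduce a transposed-axes bug, which is presumably why the conversion is confined to this one method.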
@@ -1,66 +0,0 @@
package view

import (
	"fmt"

	"github.com/mediocregopher/ginger/gg"
	"github.com/mediocregopher/ginger/gim/geo"
	"github.com/mediocregopher/ginger/gim/terminal"
)

type box struct {
	topLeft       geo.XY
	flowDir       geo.XY
	numIn, numOut int
	buf           *terminal.Buffer
	bodyBuf       *terminal.Buffer
}

func boxFromVertex(v *gg.Vertex, flowDir geo.XY) box {
	b := box{
		flowDir: flowDir,
		numIn:   len(v.In),
		numOut:  len(v.Out),
	}
	if v.VertexType == gg.ValueVertex {
		b.bodyBuf = terminal.NewBuffer()
		b.bodyBuf.WriteString(v.Value.V.(string))
	}
	return b
}

func (b box) rect() geo.Rect {
	var bodyRect geo.Rect
	if b.bodyBuf != nil {
		bodyRect.Size = b.bodyBuf.Size().Add(geo.XY{2, 2})
	}

	var edgesRect geo.Rect
	{
		var neededByEdges int
		if b.numIn > b.numOut {
			neededByEdges = b.numIn*2 + 1
		} else {
			neededByEdges = b.numOut*2 + 1
		}

		switch b.flowDir {
		case geo.Left, geo.Right:
			edgesRect.Size = geo.XY{2, neededByEdges}
		case geo.Up, geo.Down:
			edgesRect.Size = geo.XY{neededByEdges, 2}
		default:
			panic(fmt.Sprintf("unknown flowDir: %#v", b.flowDir))
		}
	}

	return bodyRect.Union(edgesRect).Translate(b.topLeft)
}

func (b box) draw(buf *terminal.Buffer) {
	rect := b.rect()
	buf.DrawRect(rect, terminal.SingleLine)
	if b.bodyBuf != nil {
		buf.DrawBufferCentered(rect.Center(), b.bodyBuf)
	}
}
@@ -1,119 +0,0 @@
// Package constraint implements an extremely simple constraint engine.
// Elements, and constraints on those elements, are given to the engine, which
// uses those constraints to generate an output. Elements are identified by
// strings.
package constraint

import (
	"github.com/mediocregopher/ginger/gg"
)

// Constraint describes a constraint on an element. The Elem field must be
// filled in, as well as exactly one other field
type Constraint struct {
	Elem string

	// LT says that Elem is less than this element
	LT string
}

var ltEdge = gg.NewValue("lt")

// Engine processes sets of constraints to generate an output
type Engine struct {
	g    *gg.Graph
	vals map[string]gg.Value
}

// NewEngine initializes and returns an empty Engine
func NewEngine() *Engine {
	return &Engine{g: gg.Null, vals: map[string]gg.Value{}}
}

func (e *Engine) getVal(elem string) gg.Value {
	if val, ok := e.vals[elem]; ok {
		return val
	}
	val := gg.NewValue(elem)
	e.vals[elem] = val
	return val
}

// AddConstraint adds the given constraint to the engine's set, and returns
// false if the constraint couldn't be added due to a conflict with a previous
// constraint
func (e *Engine) AddConstraint(c Constraint) bool {
	elem := e.getVal(c.Elem)
	g := e.g.AddValueIn(gg.ValueOut(elem, ltEdge), e.getVal(c.LT))

	// Check for loops in g starting at c.Elem, bail if there are any
	{
		seen := map[*gg.Vertex]bool{}
		start := g.ValueVertex(elem)
		var hasLoop func(v *gg.Vertex) bool
		hasLoop = func(v *gg.Vertex) bool {
			if seen[v] {
				return v == start
			}
			seen[v] = true
			for _, out := range v.Out {
				if hasLoop(out.To) {
					return true
				}
			}
			return false
		}
		if hasLoop(start) {
			return false
		}
	}

	e.g = g
	return true
}

// Solve uses the constraints which have been added to the engine to give a
// possible solution. Every root element (one with no incoming constraints) is
// assigned the value zero.
func (e *Engine) Solve() map[string]int {
	m := map[string]int{}
	if len(e.g.ValueVertices()) == 0 {
		return m
	}

	vElem := func(v *gg.Vertex) string {
		return v.Value.V.(string)
	}

	// first the roots are determined to be the elements with no In edges,
	// which _must_ exist since the graph presumably has no loops
	var roots []*gg.Vertex
	e.g.Iter(func(v *gg.Vertex) bool {
		if len(v.In) == 0 {
			roots = append(roots, v)
			m[vElem(v)] = 0
		}
		return true
	})

	// sanity check
	if len(roots) == 0 {
		panic("no roots found in graph somehow")
	}

	// a vertex's value is then the length of the longest path from it to one
	// of the roots
	var walk func(*gg.Vertex, int)
	walk = func(v *gg.Vertex, val int) {
		if elem := vElem(v); val > m[elem] {
			m[elem] = val
		}
		for _, out := range v.Out {
			walk(out.To, val+1)
		}
	}
	for _, root := range roots {
		walk(root, 0)
	}

	return m
}
@@ -1,94 +0,0 @@
package constraint

import (
	. "testing"

	"github.com/stretchr/testify/assert"
)

func TestEngineAddConstraint(t *T) {
	{
		e := NewEngine()
		assert.True(t, e.AddConstraint(Constraint{Elem: "0", LT: "1"}))
		assert.True(t, e.AddConstraint(Constraint{Elem: "1", LT: "2"}))
		assert.True(t, e.AddConstraint(Constraint{Elem: "-1", LT: "0"}))
		assert.False(t, e.AddConstraint(Constraint{Elem: "1", LT: "0"}))
		assert.False(t, e.AddConstraint(Constraint{Elem: "2", LT: "0"}))
		assert.False(t, e.AddConstraint(Constraint{Elem: "2", LT: "-1"}))
	}

	{
		e := NewEngine()
		assert.True(t, e.AddConstraint(Constraint{Elem: "0", LT: "1"}))
		assert.True(t, e.AddConstraint(Constraint{Elem: "0", LT: "2"}))
		assert.True(t, e.AddConstraint(Constraint{Elem: "1", LT: "2"}))
		assert.True(t, e.AddConstraint(Constraint{Elem: "2", LT: "3"}))
	}
}

func TestEngineSolve(t *T) {
	assertSolve := func(exp map[string]int, cc ...Constraint) {
		e := NewEngine()
		for _, c := range cc {
			assert.True(t, e.AddConstraint(c), "c:%#v", c)
		}
		assert.Equal(t, exp, e.Solve())
	}

	// basic
	assertSolve(
		map[string]int{"a": 0, "b": 1, "c": 2},
		Constraint{Elem: "a", LT: "b"},
		Constraint{Elem: "b", LT: "c"},
	)

	// "triangle" graph
	assertSolve(
		map[string]int{"a": 0, "b": 1, "c": 2},
		Constraint{Elem: "a", LT: "b"},
		Constraint{Elem: "a", LT: "c"},
		Constraint{Elem: "b", LT: "c"},
	)

	// "hexagon" graph
	assertSolve(
		map[string]int{"a": 0, "b": 1, "c": 1, "d": 2, "e": 2, "f": 3},
		Constraint{Elem: "a", LT: "b"},
		Constraint{Elem: "a", LT: "c"},
		Constraint{Elem: "b", LT: "d"},
		Constraint{Elem: "c", LT: "e"},
		Constraint{Elem: "d", LT: "f"},
		Constraint{Elem: "e", LT: "f"},
	)

	// "hexagon" with centerpoint graph
	assertSolve(
		map[string]int{"a": 0, "b": 1, "c": 1, "center": 2, "d": 3, "e": 3, "f": 4},
		Constraint{Elem: "a", LT: "b"},
		Constraint{Elem: "a", LT: "c"},
		Constraint{Elem: "b", LT: "d"},
		Constraint{Elem: "c", LT: "e"},
		Constraint{Elem: "d", LT: "f"},
		Constraint{Elem: "e", LT: "f"},

		Constraint{Elem: "c", LT: "center"},
		Constraint{Elem: "b", LT: "center"},
		Constraint{Elem: "center", LT: "e"},
		Constraint{Elem: "center", LT: "d"},
	)

	// multi-root, using two triangles which end up connecting
	assertSolve(
		map[string]int{"a": 0, "b": 1, "c": 2, "d": 0, "e": 1, "f": 2, "g": 3},
		Constraint{Elem: "a", LT: "b"},
		Constraint{Elem: "a", LT: "c"},
		Constraint{Elem: "b", LT: "c"},

		Constraint{Elem: "d", LT: "e"},
		Constraint{Elem: "d", LT: "f"},
		Constraint{Elem: "e", LT: "f"},

		Constraint{Elem: "f", LT: "g"},
	)
}
@@ -1,31 +0,0 @@
package view

import (
	"github.com/mediocregopher/ginger/gim/geo"
	"github.com/mediocregopher/ginger/gim/terminal"
)

type line struct {
	from, to   *box
	fromI, toI int
	bodyBuf    *terminal.Buffer
}

func (l line) draw(buf *terminal.Buffer, flowDir, secFlowDir geo.XY) {
	from, to := *(l.from), *(l.to)
	start := from.rect().Edge(flowDir, secFlowDir)[0].Add(secFlowDir.Scale(l.fromI*2 + 1))
	end := to.rect().Edge(flowDir.Inv(), secFlowDir)[0]
	end = end.Add(flowDir.Inv())
	end = end.Add(secFlowDir.Scale(l.toI*2 + 1))

	buf.SetPos(start)
	buf.WriteRune(terminal.SingleLine.Perpendicular(flowDir))
	buf.DrawLine(start.Add(flowDir), end.Add(flowDir.Inv()), flowDir, terminal.SingleLine)
	buf.SetPos(end)
	buf.WriteRune(terminal.SingleLine.Arrow(flowDir))

	// draw the body
	if l.bodyBuf != nil {
		buf.DrawBufferCentered(start.Midpoint(end), l.bodyBuf)
	}
}
@@ -1,27 +0,0 @@
package view

import (
	"github.com/mediocregopher/ginger/gim/geo"
	"github.com/mediocregopher/ginger/gim/terminal"
)

type edge struct {
	from, to   *vertex
	tail, head rune // if empty do directional segment char
	body       string
	switchback bool

	lineStyle terminal.LineStyle
}

type vertex struct {
	coord, pos geo.XY
	in, out    [][]*edge // top level is port index
	body       string

	// means it won't be drawn, and will be removed and have its in/out edges
	// spliced together into a single edge.
	ephemeral bool

	lineStyle terminal.LineStyle // if zero value don't draw border
}
gim/view/view.go (274 lines, deleted)
@@ -1,274 +0,0 @@
// Package view implements rendering a graph to a terminal.
//
// Steps for rendering
//
// - Preprocessing: Disjoin Graph into multiple Graphs, and decide how to
//   arrange them (maybe sort by number of vertices or number of edges (or the
//   sum of both) or something).
//
// - Convert Graph into internal representation.
//   - Still uses gg.Graph, but vertices and edge values are wrapped in types
//     internal to this package, and on which further mapping will be done.
//   - Positions unknown at this point.
//   - Junctions are converted to value vertices with set edge order.
//   - Edges contain both their body and their tail/head rune.
//
// - Find eligible "root" vertex, probably by one which has the fewest input
//   edges.
//
// - Find cycles and reverse edges as needed.
//   - The to/from vertices are reversed, as are the head/tail runes, so the
//     direction will appear consistent with the original graph.
//   - TODO this might not be necessary? Or at least may need to be modified.
//     In the paper this is done, but that algorithm allows for edges upward
//     from their tail, whereas this one doesn't. It might only be necessary
//     for the MST stuff, in which case this might only need to take place
//     within Positioning-Part1.
//
// - Replace edge bodies with a vertex with a single input/output edge.
//
// - Position all vertices
//   - `coord` field on vertices used as row/column coordinates.
//   - Positioning will be done with down being the primary direction and
//     right being the secondary direction.
//   - Part 1) find vertical positions for all vertices (aka assign rows)
//     - This step uses some fancy MST stuff as outlined by (TODO refer to
//       paper here).
//   - Part 2) find horizontal positions within rows (aka assign columns)
//     - Part of this will include creating ephemeral vertices where an
//       edge spans a row without having a vertex on it. These will be
//       removed as the final part of this step.
//     - The gist of this step is to find a vertex ordering which reduces the
//       number of edge crossings between adjacent rows.
//     - Some extra care is taken for cases where an edge's from vertex is
//       not a lower row than its to vertex.
//       - This is an unavoidable case, as at the least a vertex may
//         connect to itself.
//       - These edges will have their `switchback` field set to true.
//       - For the purposes of calculating edge crossings these edges
//         should be ignored. During the absolute positioning and drawing
//         steps they will be accounted for and dealt with.
//   - Part 3) convert row/column positions into terminal positions, which
//     are stored on the vertices in the `pos` field. Primary/secondary
//     direction are taken into account here.
//
// - Post-processing: any additional absolute positioning and other formatting
//   given by the user for the Graph should be done here.
//
// - Draw vertices and their edges to buffer
//   - At this point drawing vertices is easy. Edges are more complicated, but
//     the start/end positions of each edge should already be known, so while
//     drawing may be complex it's not difficult.
package view
import (
	"sort"

	"github.com/mediocregopher/ginger/gg"
	"github.com/mediocregopher/ginger/gim/geo"
	"github.com/mediocregopher/ginger/gim/terminal"
	"github.com/mediocregopher/ginger/gim/view/constraint"
)

// View wraps a single Graph instance and a set of display options for it, and
// generates renderable terminal output for it.
type View struct {
	g     *gg.Graph
	start gg.Value // TODO shouldn't need this

	primFlowDir, secFlowDir geo.XY
}

// New instantiates and returns a view around the given Graph instance, with
// start indicating the value vertex to consider the "root" of the graph.
//
// Drawing is done by aligning the vertices into rows and columns in such a way
// as to reduce edge crossings. primaryDir indicates the direction edges will
// primarily be pointed in. For example, if it is geo.Down then adjacent
// vertices will be arranged into columns.
//
// secondaryDir indicates the direction vertices should be arranged when they
// end up in the same "rank" (e.g. when primaryDir is geo.Down, all vertices on
// the same row will be the same "rank").
//
// A primaryDir/secondaryDir of either geo.Down/geo.Right or geo.Right/geo.Down
// are recommended, but any combination of perpendicular directions is allowed.
func New(g *gg.Graph, start gg.Value, primaryDir, secondaryDir geo.XY) *View {
	return &View{
		g:           g,
		start:       start,
		primFlowDir: primaryDir,
		secFlowDir:  secondaryDir,
	}
}

// Draw renders and draws the View's Graph to the Buffer.
func (view *View) Draw(buf *terminal.Buffer) {
	relPos, _, secSol := posSolve(view.g)

	// create boxes
	var boxes []*box
	boxesM := map[*box]*gg.Vertex{}
	boxesMr := map[*gg.Vertex]*box{}
	const (
		primPadding = 5
		secPadding  = 1
	)
	var primPos int
	for _, vv := range relPos {
		var primBoxes []*box // boxes on just this level
		var maxPrim int
		var secPos int
		for _, v := range vv {
			primVec := view.primFlowDir.Scale(primPos)
			secVec := view.secFlowDir.Scale(secPos)

			b := boxFromVertex(v, view.primFlowDir)
			b.topLeft = primVec.Add(secVec)
			boxes = append(boxes, &b)
			primBoxes = append(primBoxes, &b)
			boxesM[&b] = v
			boxesMr[v] = &b

			bSize := b.rect().Size
			primBoxLen := bSize.Mul(view.primFlowDir).Len()
			secBoxLen := bSize.Mul(view.secFlowDir).Len()
			if primBoxLen > maxPrim {
				maxPrim = primBoxLen
			}
			secPos += secBoxLen + secPadding
		}
		for _, b := range primBoxes {
			b.topLeft = b.topLeft.Add(view.primFlowDir.Scale(primPos))
		}
		primPos += maxPrim + primPadding
	}

	// maps a vertex to all of its to edges, sorted by secSol
	findFromIM := map[*gg.Vertex][]gg.Edge{}
	// returns the index of this edge in from's Out
	findFromI := func(from *gg.Vertex, e gg.Edge) int {
		edges, ok := findFromIM[from]
		if !ok {
			edges = make([]gg.Edge, len(from.Out))
			copy(edges, from.Out)
			sort.Slice(edges, func(i, j int) bool {
				// TODO if two edges go to the same vertex, how are they sorted?
				return secSol[edges[i].To.ID] < secSol[edges[j].To.ID]
			})
			findFromIM[from] = edges
		}

		for i, fe := range edges {
			if fe == e {
				return i
			}
		}
		panic("edge not found in from.Out")
	}

	// create lines
	var lines []line
	for _, b := range boxes {
		v := boxesM[b]
		for i, e := range v.In {
			bFrom := boxesMr[e.From]
			fromI := findFromI(e.From, e)
			buf := terminal.NewBuffer()
			buf.WriteString(e.Value.V.(string))
			lines = append(lines, line{
				from:    bFrom,
				fromI:   fromI,
				to:      b,
				toI:     i,
				bodyBuf: buf,
			})
		}
	}

	// actually draw the boxes and lines
	for _, b := range boxes {
		b.draw(buf)
	}
	for _, line := range lines {
		line.draw(buf, view.primFlowDir, view.secFlowDir)
	}
}
// "Solves" vertex position by determining relative positions of vertices in
// primary and secondary directions (independently), with relative positions
// being described by "levels", where multiple vertices can occupy one level.
//
// Primary determines relative position in the primary direction by trying
// to place vertices before their outs and after their ins.
//
// Secondary determines relative position in the secondary direction by
// trying to place vertices relative to vertices they share an edge with in
// the order that the edges appear on the shared node.
func posSolve(g *gg.Graph) ([][]*gg.Vertex, map[string]int, map[string]int) {
	primEng := constraint.NewEngine()
	secEng := constraint.NewEngine()

	strM := g.ByID()
	for _, v := range strM {
		var prevIn *gg.Vertex
		for _, e := range v.In {
			primEng.AddConstraint(constraint.Constraint{
				Elem: e.From.ID,
				LT:   v.ID,
			})
			if prevIn != nil {
				secEng.AddConstraint(constraint.Constraint{
					Elem: prevIn.ID,
					LT:   e.From.ID,
				})
			}
			prevIn = e.From
		}

		var prevOut *gg.Vertex
		for _, e := range v.Out {
			if prevOut == nil {
				continue
			}
			secEng.AddConstraint(constraint.Constraint{
				Elem: prevOut.ID,
				LT:   e.To.ID,
			})
			prevOut = e.To
		}
	}
	prim := primEng.Solve()
	sec := secEng.Solve()

	// determine maximum primary level
	var maxPrim int
	for _, lvl := range prim {
		if lvl > maxPrim {
			maxPrim = lvl
		}
	}

	outStr := make([][]string, maxPrim+1)
	for v, lvl := range prim {
		outStr[lvl] = append(outStr[lvl], v)
	}

	// sort each primary level
	for _, vv := range outStr {
		sort.Slice(vv, func(i, j int) bool {
			return sec[vv[i]] < sec[vv[j]]
		})
	}

	// convert to vertices
	out := make([][]*gg.Vertex, len(outStr))
	for i, vv := range outStr {
		out[i] = make([]*gg.Vertex, len(outStr[i]))
		for j, v := range vv {
			out[i][j] = strM[v]
		}
	}
	return out, prim, sec
}
go.mod (14 lines, new file)
@@ -0,0 +1,14 @@
module code.betamike.com/mediocregopher/ginger

go 1.18

require (
	github.com/stretchr/testify v1.7.0
	golang.org/x/exp v0.0.0-20231006140011-7918f672742d
)

require (
	github.com/davecgh/go-spew v1.1.0 // indirect
	github.com/pmezard/go-difflib v1.0.0 // indirect
	gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c // indirect
)
go.sum (13 lines, new file)
@@ -0,0 +1,13 @@
github.com/davecgh/go-spew v1.1.0 h1:ZDRjVQ15GmhC3fiQ8ni8+OwkZQO4DARzQgrnXU1Liz8=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.7.0 h1:nwc3DEeHmmLAfoZucVR881uASk0Mfjw8xYJ99tb5CcY=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
golang.org/x/exp v0.0.0-20231006140011-7918f672742d h1:jtJma62tbqLibJ5sFQz8bKtEM8rJBtfilJ2qTU199MI=
golang.org/x/exp v0.0.0-20231006140011-7918f672742d/go.mod h1:ldy0pHrwJyGW56pPQzzkH36rKxoZW1tw7ZJpeKx+hdo=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
graph/graph.go (432 lines, new file)
@@ -0,0 +1,432 @@
// Package graph implements a generic directed graph type, with support for
// tuple vertices in addition to traditional "value" vertices.
package graph

import (
	"fmt"
	"strings"
)

// Value is any value which can be stored within a Graph. Values should be
// considered immutable, ie once used with the graph package their internal
// value does not change.
type Value interface {
	Equal(Value) bool
	String() string
}

// OpenEdge consists of the edge value (E) and source vertex value (V) of an
// edge in a Graph. When passed into the AddValueIn method a full edge is
// created. An OpenEdge can also be sourced from a tuple vertex, whose value is
// an ordered set of OpenEdges of this same type.
type OpenEdge[E, V Value] struct {
	val *V
	tup []*OpenEdge[E, V]

	edgeVal E
}

func (oe *OpenEdge[E, V]) equal(oe2 *OpenEdge[E, V]) bool {
	if !oe.edgeVal.Equal(oe2.edgeVal) {
		return false
	}

	if oe.val != nil {
		return oe2.val != nil && (*oe.val).Equal(*oe2.val)
	}

	if len(oe.tup) != len(oe2.tup) {
		return false
	}

	for i := range oe.tup {
		if !oe.tup[i].equal(oe2.tup[i]) {
			return false
		}
	}

	return true
}

func (oe *OpenEdge[E, V]) String() string {

	vertexType := "tup"

	var fromStr string

	if oe.val != nil {

		vertexType = "val"
		fromStr = (*oe.val).String()

	} else {

		strs := make([]string, len(oe.tup))

		for i := range oe.tup {
			strs[i] = oe.tup[i].String()
		}

		fromStr = fmt.Sprintf("[%s]", strings.Join(strs, ", "))
	}

	return fmt.Sprintf("%s(%s, %s)", vertexType, fromStr, oe.edgeVal.String())
}

// WithEdgeValue returns a copy of the OpenEdge with the given Value replacing
// the previous edge value.
//
// NOTE I _think_ this can be factored out once Graph is genericized.
func (oe *OpenEdge[E, V]) WithEdgeValue(val E) *OpenEdge[E, V] {
	oeCp := *oe
	oeCp.edgeVal = val
	return &oeCp
}

// EdgeValue returns the Value which lies on the edge itself.
func (oe OpenEdge[E, V]) EdgeValue() E {
	return oe.edgeVal
}

// FromValue returns the Value from which the OpenEdge was created via ValueOut,
// or false if it wasn't created via ValueOut.
func (oe OpenEdge[E, V]) FromValue() (V, bool) {
	if oe.val == nil {
		var zero V
		return zero, false
	}

	return *oe.val, true
}

// FromTuple returns the tuple of OpenEdges from which the OpenEdge was created
// via TupleOut, or false if it wasn't created via TupleOut.
func (oe OpenEdge[E, V]) FromTuple() ([]*OpenEdge[E, V], bool) {
	if oe.val != nil {
		return nil, false
	}

	return oe.tup, true
}

// ValueOut creates an OpenEdge which, when used to construct a Graph,
// represents an edge (with edgeVal attached to it) coming from the vertex
// containing val.
func ValueOut[E, V Value](edgeVal E, val V) *OpenEdge[E, V] {
	return &OpenEdge[E, V]{
		val:     &val,
		edgeVal: edgeVal,
	}
}

// TupleOut creates an OpenEdge which, when used to construct a Graph,
// represents an edge (with edgeVal attached to it) coming from the vertex
// comprised of the given ordered-set of input edges.
func TupleOut[E, V Value](edgeVal E, ins ...*OpenEdge[E, V]) *OpenEdge[E, V] {

	if len(ins) == 1 {

		var (
			zero E
			in   = ins[0]
		)

		if edgeVal.Equal(zero) {
			return in
		}

		if in.edgeVal.Equal(zero) {
			return in.WithEdgeValue(edgeVal)
		}

	}

	return &OpenEdge[E, V]{
		tup:     ins,
		edgeVal: edgeVal,
	}
}
type graphValueIn[E, V Value] struct {
	val  V
	edge *OpenEdge[E, V]
}

func (valIn graphValueIn[E, V]) equal(valIn2 graphValueIn[E, V]) bool {
	return valIn.val.Equal(valIn2.val) && valIn.edge.equal(valIn2.edge)
}

// Graph is an immutable container of a set of vertices. The Graph keeps track
// of all Values which terminate an OpenEdge. E indicates the type of edge
// values, while V indicates the type of vertex values.
//
// NOTE The current implementation of Graph is incredibly inefficient, there's
// lots of O(N) operations, unnecessary copying on changes, and duplicate data
// in memory.
type Graph[E, V Value] struct {
	edges  []*OpenEdge[E, V]
	valIns []graphValueIn[E, V]
}

func (g *Graph[E, V]) cp() *Graph[E, V] {
	cp := &Graph[E, V]{
		edges:  make([]*OpenEdge[E, V], len(g.edges)),
		valIns: make([]graphValueIn[E, V], len(g.valIns)),
	}
	copy(cp.edges, g.edges)
	copy(cp.valIns, g.valIns)
	return cp
}

func (g *Graph[E, V]) String() string {

	var strs []string

	for _, valIn := range g.valIns {
		strs = append(
			strs,
			fmt.Sprintf("valIn(%s, %s)", valIn.edge.String(), valIn.val.String()),
		)
	}

	return fmt.Sprintf("graph(%s)", strings.Join(strs, ", "))
}

// NOTE this method is used more for its functionality than for any performance
// reasons... it's incredibly inefficient in how it deduplicates edges, but by
// doing the deduplication we enable the graph map operation to work correctly.
func (g *Graph[E, V]) dedupeEdge(edge *OpenEdge[E, V]) *OpenEdge[E, V] {

	// check if there's an existing edge which is fully equivalent in the graph
	// already, and if so return that.
	for i := range g.edges {
		if g.edges[i].equal(edge) {
			return g.edges[i]
		}
	}

	// if this edge is a value edge then there's nothing else to do, return it.
	if _, ok := edge.FromValue(); ok {
		return edge
	}

	// this edge is a tuple edge, it's possible that one of its sub-edges is
	// already in the graph. dedupe each sub-edge individually.

	tupEdges := make([]*OpenEdge[E, V], len(edge.tup))

	for i := range edge.tup {
		tupEdges[i] = g.dedupeEdge(edge.tup[i])
	}

	return TupleOut(edge.EdgeValue(), tupEdges...)
}

// ValueIns returns, if any, all OpenEdges which lead to the given Value in the
// Graph (ie, all those added via AddValueIn).
//
// The returned slice should not be modified.
//
// TODO better name might be OpenEdgesInto.
func (g *Graph[E, V]) ValueIns(val Value) []*OpenEdge[E, V] {

	var edges []*OpenEdge[E, V]

	for _, valIn := range g.valIns {
		if valIn.val.Equal(val) {
			edges = append(edges, valIn.edge)
		}
	}

	return edges
}

// AddValueIn takes an OpenEdge and connects it to the Value vertex containing
// val, returning the new Graph which reflects that connection.
func (g *Graph[E, V]) AddValueIn(val V, oe *OpenEdge[E, V]) *Graph[E, V] {

	valIn := graphValueIn[E, V]{
		val:  val,
		edge: oe,
	}

	for i := range g.valIns {
		if g.valIns[i].equal(valIn) {
			return g
		}
	}

	valIn.edge = g.dedupeEdge(valIn.edge)

	g = g.cp()
	g.valIns = append(g.valIns, valIn)

	return g
}

// AllValueIns returns all values which have had incoming edges added to them
// using AddValueIn.
func (g *Graph[E, V]) AllValueIns() []V {
	vals := make([]V, len(g.valIns))
	for i := range g.valIns {
		vals[i] = g.valIns[i].val
	}
	return vals
}

// Equal returns whether or not the two Graphs are equivalent in value.
func (g *Graph[E, V]) Equal(g2 *Graph[E, V]) bool {

	if len(g.valIns) != len(g2.valIns) {
		return false
	}

outer:
	for _, valIn := range g.valIns {

		for _, valIn2 := range g2.valIns {

			if valIn.equal(valIn2) {
				continue outer
			}
		}

		return false
	}

	return true
}

func mapReduce[Ea, Va Value, Vb any](
	root *OpenEdge[Ea, Va],
	mapVal func(Va) (Vb, error),
	reduceEdge func(*OpenEdge[Ea, Va], []Vb) (Vb, error),
) (
	Vb, error,
) {

	if valA, ok := root.FromValue(); ok {

		valB, err := mapVal(valA)

		if err != nil {
			var zero Vb
			return zero, err
		}

		return reduceEdge(root, []Vb{valB})
	}

	tupA, _ := root.FromTuple()

	valsB := make([]Vb, len(tupA))

	for i := range tupA {

		valB, err := mapReduce[Ea, Va, Vb](
			tupA[i], mapVal, reduceEdge,
		)

		if err != nil {
			var zero Vb
			return zero, err
		}

		valsB[i] = valB
	}

	return reduceEdge(root, valsB)
}

type mappedVal[Va Value, Vb any] struct {
	valA Va
	valB Vb // result
}

type reducedEdge[Ea, Va Value, Vb any] struct {
	edgeA *OpenEdge[Ea, Va]
	valB  Vb // result
}

// MapReduce recursively computes a resultant Value of type Vb from an
// OpenEdge[Ea, Va].
//
// Tuple edges which are encountered will have Reduce called on each OpenEdge
// branch of the tuple, to obtain a Vb for each branch. The edge value of the
// tuple edge (Ea) and the just obtained Vbs are then passed to reduceEdge to
// obtain a Vb for that edge.
//
// The values of value edges (Va) which are encountered are mapped to Vb using
// the mapVal function. The edge value of those value edges (Ea) and the just
// obtained Vb value are then passed to reduceEdge to obtain a Vb for that edge.
//
// If either the map or reduce function returns an error then processing is
// immediately cancelled and that error is returned directly.
//
// If a value or edge is connected to multiple times within the root OpenEdge it
// will only be mapped/reduced a single time, and the result of that single
// map/reduction will be passed to each dependant operation.
func MapReduce[Ea, Va Value, Vb any](
	root *OpenEdge[Ea, Va],
	mapVal func(Va) (Vb, error),
	reduceEdge func(Ea, []Vb) (Vb, error),
) (
	Vb, error,
) {

	var (
		zeroB Vb

		// we use these to memoize reductions on values and edges, so a
		// reduction is only performed a single time for each value/edge.
		//
		// NOTE this is not implemented very efficiently.
		mappedVals   []mappedVal[Va, Vb]
		reducedEdges []reducedEdge[Ea, Va, Vb]
	)

	return mapReduce[Ea, Va, Vb](
		root,
		func(valA Va) (Vb, error) {

			for _, mappedVal := range mappedVals {
				if mappedVal.valA.Equal(valA) {
					return mappedVal.valB, nil
				}
			}

			valB, err := mapVal(valA)

			if err != nil {
				return zeroB, err
			}

			mappedVals = append(mappedVals, mappedVal[Va, Vb]{
				valA: valA,
				valB: valB,
			})

			return valB, nil
		},
		func(edgeA *OpenEdge[Ea, Va], valBs []Vb) (Vb, error) {

			for _, reducedEdge := range reducedEdges {
				if reducedEdge.edgeA.equal(edgeA) {
					return reducedEdge.valB, nil
				}
			}
|
valB, err := reduceEdge(edgeA.EdgeValue(), valBs)
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return zeroB, err
|
||||||
|
}
|
||||||
|
|
||||||
|
reducedEdges = append(reducedEdges, reducedEdge[Ea, Va, Vb]{
|
||||||
|
edgeA: edgeA,
|
||||||
|
valB: valB,
|
||||||
|
})
|
||||||
|
|
||||||
|
return valB, nil
|
||||||
|
},
|
||||||
|
)
|
||||||
|
}
|
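The memoization that MapReduce documents (a shared value or edge is mapped/reduced once, and the single result is reused) can be sketched standalone. This is a hypothetical, self-contained illustration of the technique, not the gg package's code: `node` stands in for `OpenEdge`, and memoization is keyed by pointer identity rather than by `Equal`.

```go
package main

import "fmt"

// node is a stand-in for gg's OpenEdge: either a leaf value, or an
// operation ("add") applied to child nodes. Names here are hypothetical.
type node struct {
	val      int
	op       string
	children []*node
}

// mapReduce folds the tree bottom-up. The memo map ensures a subtree that
// is referenced multiple times is only evaluated once.
func mapReduce(n *node, memo map[*node]int) int {
	if v, ok := memo[n]; ok {
		return v
	}
	var out int
	if len(n.children) == 0 {
		out = n.val * 10 // the "map" step on a leaf value
	} else {
		vs := make([]int, len(n.children))
		for i, c := range n.children {
			vs[i] = mapReduce(c, memo)
		}
		// the "reduce" step on an edge
		for _, v := range vs {
			out += v
		}
	}
	memo[n] = out
	return out
}

func main() {
	shared := &node{val: 1}
	root := &node{op: "add", children: []*node{shared, shared, {val: 2}}}
	// shared is mapped once but its result is used twice: 10 + 10 + 20
	fmt.Println(mapReduce(root, map[*node]int{})) // 40
}
```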
graph/graph_test.go (new file, 268 lines)
@@ -0,0 +1,268 @@
package graph

import (
	"errors"
	"fmt"
	"strconv"
	"testing"

	"github.com/stretchr/testify/assert"
)

type S string

func (s S) Equal(s2 Value) bool { return s == s2.(S) }

func (s S) String() string { return string(s) }

type I int

func (i I) Equal(i2 Value) bool { return i == i2.(I) }

func (i I) String() string { return strconv.Itoa(int(i)) }

func TestEqual(t *testing.T) {

	var (
		zeroValue S
		zeroGraph = new(Graph[S, S])
	)

	tests := []struct {
		a, b *Graph[S, S]
		exp  bool
	}{
		{
			a:   zeroGraph,
			b:   zeroGraph,
			exp: true,
		},
		{
			a:   zeroGraph,
			b:   zeroGraph.AddValueIn("out", ValueOut[S, S]("incr", "in")),
			exp: false,
		},
		{
			a:   zeroGraph.AddValueIn("out", ValueOut[S, S]("incr", "in")),
			b:   zeroGraph.AddValueIn("out", ValueOut[S, S]("incr", "in")),
			exp: true,
		},
		{
			a: zeroGraph.AddValueIn("out", ValueOut[S, S]("incr", "in")),
			b: zeroGraph.AddValueIn("out", TupleOut[S, S](
				"add",
				ValueOut[S, S]("ident", "in"),
				ValueOut[S, S]("ident", "1"),
			)),
			exp: false,
		},
		{
			// tuples are different order
			a: zeroGraph.AddValueIn("out", TupleOut[S, S](
				"add",
				ValueOut[S, S]("ident", "1"),
				ValueOut[S, S]("ident", "in"),
			)),
			b: zeroGraph.AddValueIn("out", TupleOut[S, S](
				"add",
				ValueOut[S, S]("ident", "in"),
				ValueOut[S, S]("ident", "1"),
			)),
			exp: false,
		},
		{
			// tuple with no edge value and just a single input edge should be
			// equivalent to just that edge.
			a: zeroGraph.AddValueIn("out", TupleOut[S, S](
				zeroValue,
				ValueOut[S, S]("ident", "1"),
			)),
			b:   zeroGraph.AddValueIn("out", ValueOut[S, S]("ident", "1")),
			exp: true,
		},
		{
			// tuple with an edge value and just a single input edge that has no
			// edgeVal should be equivalent to just that edge with the tuple's
			// edge value.
			a: zeroGraph.AddValueIn("out", TupleOut[S, S](
				"ident",
				ValueOut[S, S](zeroValue, "1"),
			)),
			b:   zeroGraph.AddValueIn("out", ValueOut[S, S]("ident", "1")),
			exp: true,
		},
		{
			a: zeroGraph.
				AddValueIn("out", ValueOut[S, S]("incr", "in")).
				AddValueIn("out2", ValueOut[S, S]("incr2", "in2")),
			b: zeroGraph.
				AddValueIn("out", ValueOut[S, S]("incr", "in")),
			exp: false,
		},
		{
			a: zeroGraph.
				AddValueIn("out", ValueOut[S, S]("incr", "in")).
				AddValueIn("out2", ValueOut[S, S]("incr2", "in2")),
			b: zeroGraph.
				AddValueIn("out", ValueOut[S, S]("incr", "in")).
				AddValueIn("out2", ValueOut[S, S]("incr2", "in2")),
			exp: true,
		},
		{
			// order of value ins shouldn't matter
			a: zeroGraph.
				AddValueIn("out", ValueOut[S, S]("incr", "in")).
				AddValueIn("out2", ValueOut[S, S]("incr2", "in2")),
			b: zeroGraph.
				AddValueIn("out2", ValueOut[S, S]("incr2", "in2")).
				AddValueIn("out", ValueOut[S, S]("incr", "in")),
			exp: true,
		},
	}

	for i, test := range tests {
		t.Run(strconv.Itoa(i), func(t *testing.T) {
			assert.Equal(t, test.exp, test.a.Equal(test.b))
		})
	}
}

type mapReduceTestEdge struct {
	name string
	fn   func([]int) int
	done bool
}

func (e *mapReduceTestEdge) Equal(e2i Value) bool {

	e2, _ := e2i.(*mapReduceTestEdge)

	if e == nil || e2 == nil {
		return e == e2
	}

	return e.name == e2.name
}

func (e *mapReduceTestEdge) String() string {
	return e.name
}

func (e *mapReduceTestEdge) do(ii []int) int {

	if e.done {
		panic(fmt.Sprintf("%q already done", e.name))
	}

	e.done = true

	return e.fn(ii)
}

func TestMapReduce(t *testing.T) {

	type (
		Va   = I
		Vb   = int
		Ea   = *mapReduceTestEdge
		edge = OpenEdge[Ea, Va]
	)

	var (
		zeroVb Vb
	)

	vOut := func(edge Ea, val Va) *edge {
		return ValueOut[Ea, Va](edge, val)
	}

	tOut := func(edge Ea, ins ...*edge) *edge {
		return TupleOut[Ea, Va](edge, ins...)
	}

	add := func() *mapReduceTestEdge {
		return &mapReduceTestEdge{
			name: "add",
			fn: func(ii []int) int {
				var n int
				for _, i := range ii {
					n += i
				}
				return n
			},
		}
	}

	mul := func() *mapReduceTestEdge {
		return &mapReduceTestEdge{
			name: "mul",
			fn: func(ii []int) int {
				n := 1
				for _, i := range ii {
					n *= i
				}
				return n
			},
		}
	}

	mapVal := func(valA Va) (Vb, error) {
		return Vb(valA * 10), nil
	}

	reduceEdge := func(edgeA Ea, valBs []Vb) (Vb, error) {

		if edgeA == nil {

			if len(valBs) == 1 {
				return valBs[0], nil
			}

			return zeroVb, errors.New("tuple edge must have edge value")
		}

		return edgeA.do(valBs), nil
	}

	tests := []struct {
		in  *edge
		exp int
	}{
		{
			in:  vOut(nil, 1),
			exp: 10,
		},
		{
			in:  vOut(add(), 1),
			exp: 10,
		},
		{
			in: tOut(
				add(),
				vOut(nil, 1),
				vOut(add(), 2),
				vOut(mul(), 3),
			),
			exp: 60,
		},
		{
			// duplicate edges and values getting used twice, each should only
			// get eval'd once
			in: tOut(
				add(),
				tOut(add(), vOut(nil, 1), vOut(nil, 2)),
				tOut(add(), vOut(nil, 1), vOut(nil, 2)),
				tOut(add(), vOut(nil, 3), vOut(nil, 3)),
			),
			exp: 120,
		},
	}

	for i, test := range tests {
		t.Run(strconv.Itoa(i), func(t *testing.T) {
			got, err := MapReduce(test.in, mapVal, reduceEdge)
			assert.NoError(t, err)
			assert.Equal(t, test.exp, got)
		})
	}
}
lang/lang.go (deleted, 118 lines)
@@ -1,118 +0,0 @@
package lang

import (
	"fmt"
	"reflect"
	"strings"
)

// Commonly used Terms
var (
	// Language structure types
	AAtom  = Atom("atom")
	AConst = Atom("const")
	ATuple = Atom("tup")
	AList  = Atom("list")

	// Match shortcuts
	AUnder    = Atom("_")
	TDblUnder = Tuple{AUnder, AUnder}
)

// Term is a unit of language which carries some meaning. Some Terms are
// actually comprised of multiple sub-Terms.
type Term interface {
	fmt.Stringer // for debugging

	// Type returns a Term which describes the type of this Term, i.e. the
	// components this Term is comprised of.
	Type() Term
}

// Equal returns whether or not two Terms are of equal value.
func Equal(t1, t2 Term) bool {
	return reflect.DeepEqual(t1, t2)
}

////////////////////////////////////////////////////////////////////////////////

// Atom is a constant with no other meaning than that it can be equal or not
// equal to another Atom.
type Atom string

func (a Atom) String() string {
	return string(a)
}

// Type implements the method for Term.
func (a Atom) Type() Term {
	return AAtom
}

////////////////////////////////////////////////////////////////////////////////

// Const is a constant whose meaning depends on the context in which it is used.
type Const string

func (a Const) String() string {
	return string(a)
}

// Type implements the method for Term.
func (a Const) Type() Term {
	return AConst
}

////////////////////////////////////////////////////////////////////////////////

// Tuple is a compound Term of zero or more sub-Terms, each of which may have a
// different Type. Both the length of the Tuple and the Type of each of its
// sub-Terms are components in the Tuple's Type.
type Tuple []Term

func (t Tuple) String() string {
	ss := make([]string, len(t))
	for i := range t {
		ss[i] = t[i].String()
	}
	return "(" + strings.Join(ss, " ") + ")"
}

// Type implements the method for Term.
func (t Tuple) Type() Term {
	tt := make(Tuple, len(t))
	for i := range t {
		tt[i] = t[i].Type()
	}
	return Tuple{ATuple, tt}
}

////////////////////////////////////////////////////////////////////////////////

type list struct {
	typ Term
	ll  []Term
}

// List is a compound Term of zero or more sub-Terms, each of which must have
// the same Type (the one given as the first argument to this function). Only
// the Type of the sub-Terms is a component in the List's Type.
func List(typ Term, elems ...Term) Term {
	return list{
		typ: typ,
		ll:  elems,
	}
}

func (l list) String() string {
	ss := make([]string, len(l.ll))
	for i := range l.ll {
		ss[i] = l.ll[i].String()
	}
	return "[" + strings.Join(ss, " ") + "]"
}

// Type implements the method for Term.
func (l list) Type() Term {
	return Tuple{AList, l.typ}
}
@@ -1,54 +0,0 @@
package lang

import "fmt"

// Match is used to pattern match an arbitrary Term against a pattern. A pattern
// is a 2-tuple of the type (as an atom, e.g. AAtom, AConst) and a matching
// value.
//
// If the value is AUnder the pattern will match all Terms of the type,
// regardless of their value. If the pattern's type and value are both AUnder
// the pattern will match all Terms.
//
// If the pattern's value is a Tuple or a List, each of its values will be used
// as a sub-pattern to match against the corresponding value in the value.
func Match(pat Tuple, t Term) bool {
	if len(pat) != 2 {
		return false
	}
	pt, pv := pat[0], pat[1]

	switch pt {
	case AAtom:
		a, ok := t.(Atom)
		return ok && (Equal(pv, AUnder) || Equal(pv, a))
	case AConst:
		c, ok := t.(Const)
		return ok && (Equal(pv, AUnder) || Equal(pv, c))
	case ATuple:
		tt, ok := t.(Tuple)
		if !ok {
			return false
		} else if Equal(pv, AUnder) {
			return true
		}

		pvt := pv.(Tuple)
		if len(tt) != len(pvt) {
			return false
		}
		for i := range tt {
			pvti, ok := pvt[i].(Tuple)
			if !ok || !Match(pvti, tt[i]) {
				return false
			}
		}
		return true
	case AList:
		panic("TODO")
	case AUnder:
		return true
	default:
		panic(fmt.Sprintf("unknown type %T", pt))
	}
}
@@ -1,66 +0,0 @@
package lang

import (
	. "testing"

	"github.com/stretchr/testify/assert"
)

func TestMatch(t *T) {
	pat := func(typ, val Term) Tuple {
		return Tuple{typ, val}
	}

	tests := []struct {
		pattern Tuple
		t       Term
		exp     bool
	}{
		{pat(AAtom, Atom("foo")), Atom("foo"), true},
		{pat(AAtom, Atom("foo")), Atom("bar"), false},
		{pat(AAtom, Atom("foo")), Const("foo"), false},
		{pat(AAtom, Atom("foo")), Tuple{Atom("a"), Atom("b")}, false},
		{pat(AAtom, Atom("_")), Atom("bar"), true},
		{pat(AAtom, Atom("_")), Const("bar"), false},

		{pat(AConst, Const("foo")), Const("foo"), true},
		{pat(AConst, Const("foo")), Atom("foo"), false},
		{pat(AConst, Const("foo")), Const("bar"), false},
		{pat(AConst, Atom("_")), Const("bar"), true},
		{pat(AConst, Atom("_")), Atom("foo"), false},

		{
			pat(ATuple, Tuple{
				pat(AAtom, Atom("foo")),
				pat(AAtom, Atom("bar")),
			}),
			Tuple{Atom("foo"), Atom("bar")},
			true,
		},
		{
			pat(ATuple, Tuple{
				pat(AAtom, Atom("_")),
				pat(AAtom, Atom("bar")),
			}),
			Tuple{Atom("foo"), Atom("bar")},
			true,
		},
		{
			pat(ATuple, Tuple{
				pat(AAtom, Atom("_")),
				pat(AAtom, Atom("_")),
				pat(AAtom, Atom("_")),
			}),
			Tuple{Atom("foo"), Atom("bar")},
			false,
		},

		{pat(AUnder, AUnder), Atom("foo"), true},
		{pat(AUnder, AUnder), Const("foo"), true},
		{pat(AUnder, AUnder), Tuple{Atom("a"), Atom("b")}, true},
	}

	for _, testCase := range tests {
		assert.Equal(t, testCase.exp, Match(testCase.pattern, testCase.t), "%#v", testCase)
	}
}
lexer/lexer.go (deleted, 349 lines)
@@ -1,349 +0,0 @@
package lexer

import (
	"bufio"
	"bytes"
	"errors"
	"fmt"
	"io"
	"strings"
)

// TokenType indicates the type of a token.
type TokenType string

// Different token types
const (
	Identifier TokenType = "identifier"

	// Punctuation are tokens which connect two other tokens
	Punctuation TokenType = "punctuation"

	// Wrapper wraps one or more tokens
	Wrapper TokenType = "wrapper"
	String  TokenType = "string"
	Err     TokenType = "err"
	EOF     TokenType = "eof"
)

// Token is a single token which has been read in. All Tokens have a non-empty
// Val.
type Token struct {
	TokenType
	Val      string
	Row, Col int
}

// Equal returns whether two tokens are of equal type and value.
func (tok Token) Equal(tok2 Token) bool {
	return tok.TokenType == tok2.TokenType && tok.Val == tok2.Val
}

// Err returns the error contained by the token, if any. Only returns non-nil
// if TokenType is Err or EOF.
func (tok Token) Err() error {
	if tok.TokenType == Err || tok.TokenType == EOF {
		return fmt.Errorf("[line:%d col:%d] %s", tok.Row, tok.Col, tok.Val)
	}
	return nil
}

func (tok Token) String() string {
	var typ string
	switch tok.TokenType {
	case Identifier:
		typ = "ident"
	case Punctuation:
		typ = "punct"
	case String:
		typ = "str"
	case Err, EOF:
		typ = "err"
	}
	return fmt.Sprintf("%s(%q)", typ, tok.Val)
}

type lexerFn func(*Lexer) lexerFn

// Lexer is used to read in ginger tokens from a source. HasNext() must be
// called before every call to Next().
type Lexer struct {
	in  *bufio.Reader
	out *bytes.Buffer
	cur lexerFn

	next []Token

	row, col       int
	absRow, absCol int
}

// New returns a Lexer which will read tokens from the given source.
func New(r io.Reader) *Lexer {
	return &Lexer{
		in:  bufio.NewReader(r),
		out: new(bytes.Buffer),
		cur: lex,

		row: -1,
		col: -1,
	}
}

func (l *Lexer) emit(t TokenType) {
	str := l.out.String()
	if str == "" {
		panic("cannot emit empty token")
	}
	l.out.Reset()

	l.emitTok(Token{
		TokenType: t,
		Val:       str,
		Row:       l.row,
		Col:       l.col,
	})
}

func (l *Lexer) emitErr(err error) {
	tok := Token{
		TokenType: Err,
		Val:       err.Error(),
		Row:       l.absRow,
		Col:       l.absCol,
	}
	if err == io.EOF {
		tok.TokenType = EOF
	}
	l.emitTok(tok)
}

func (l *Lexer) emitTok(tok Token) {
	l.next = append(l.next, tok)
	l.row = -1
	l.col = -1
}

func (l *Lexer) readRune() (rune, error) {
	r, _, err := l.in.ReadRune()
	if err != nil {
		return r, err
	}

	if r == '\n' {
		l.absRow++
		l.absCol = 0
	} else {
		l.absCol++
	}

	return r, err
}

func (l *Lexer) peekRune() (rune, error) {
	r, _, err := l.in.ReadRune()
	if err != nil {
		return r, err
	}

	if err := l.in.UnreadRune(); err != nil {
		return r, err
	}
	return r, nil
}

func (l *Lexer) readAndPeek() (rune, rune, error) {
	r, err := l.readRune()
	if err != nil {
		return r, 0, err
	}

	n, err := l.peekRune()
	return r, n, err
}

func (l *Lexer) bufferRune(r rune) {
	l.out.WriteRune(r)
	if l.row < 0 && l.col < 0 {
		l.row, l.col = l.absRow, l.absCol
	}
}

// HasNext returns true if Next should be called, and false if it should not be
// called and Err should be called instead. When HasNext returns false the
// Lexer is considered to be done.
func (l *Lexer) HasNext() bool {
	for {
		if len(l.next) > 0 {
			return true
		} else if l.cur == nil {
			return false
		}
		l.cur = l.cur(l)
	}
}

// Next returns the next available token. HasNext must be called before every
// call to Next.
func (l *Lexer) Next() Token {
	t := l.next[0]
	l.next = l.next[1:]
	if len(l.next) == 0 {
		l.next = nil
	}
	return t
}

////////////////////////////////////////////////////////////////////////////////
// the actual fsm

var whitespaceSet = " \n\r\t\v\f"
var punctuationSet = ",>"
var wrapperSet = "{}()"
var identifierSepSet = whitespaceSet + punctuationSet + wrapperSet

func lex(l *Lexer) lexerFn {
	r, err := l.readRune()
	if err != nil {
		l.emitErr(err)
		return nil
	}

	// handle comments first, cause we have to peek for those. We ignore
	// errors, and assume that any error that would happen here will happen
	// again on the next read
	if n, _ := l.peekRune(); r == '/' && n == '/' {
		return lexLineComment
	} else if r == '/' && n == '*' {
		return lexBlockComment
	}

	return lexSingleRune(l, r)
}

func lexSingleRune(l *Lexer, r rune) lexerFn {
	switch {
	case strings.ContainsRune(whitespaceSet, r):
		return lex
	case strings.ContainsRune(punctuationSet, r):
		l.bufferRune(r)
		l.emit(Punctuation)
		return lex
	case strings.ContainsRune(wrapperSet, r):
		l.bufferRune(r)
		l.emit(Wrapper)
		return lex
	case r == '"' || r == '\'' || r == '`':
		canEscape := r != '`'
		return lexStrStart(l, r, makeLexStr(r, canEscape))
	default:
		l.bufferRune(r)
		return lexIdentifier
	}
}

func lexIdentifier(l *Lexer) lexerFn {
	r, err := l.readRune()
	if err != nil {
		l.emit(Identifier)
		l.emitErr(err)
		return nil
	}

	if strings.ContainsRune(identifierSepSet, r) {
		l.emit(Identifier)
		return lexSingleRune(l, r)
	}

	l.bufferRune(r)

	return lexIdentifier
}

func lexLineComment(l *Lexer) lexerFn {
	r, err := l.readRune()
	if err != nil {
		l.emitErr(err)
		return nil
	}
	if r == '\n' {
		return lex
	}
	return lexLineComment
}

// assumes the starting / has been read already
func lexBlockComment(l *Lexer) lexerFn {
	depth := 1

	var recurse lexerFn
	recurse = func(l *Lexer) lexerFn {
		r, err := l.readRune()
		if err != nil {
			l.emitErr(err)
			return nil
		}
		n, _ := l.peekRune()

		if r == '/' && n == '*' {
			depth++
		} else if r == '*' && n == '/' {
			depth--
		}

		if depth == 0 {
			return lexSkipThen(lex)
		}
		return recurse
	}
	return recurse
}

func lexStrStart(lexer *Lexer, r rune, then lexerFn) lexerFn {
	lexer.bufferRune(r)
	return then
}

func makeLexStr(quoteC rune, canEscape bool) lexerFn {
	var fn lexerFn
	fn = func(l *Lexer) lexerFn {
		r, n, err := l.readAndPeek()
		if err != nil {
			if err == io.EOF {
				if r == quoteC {
					l.bufferRune(r)
					l.emit(String)
					l.emitErr(err)
					return nil
				}
				l.emitErr(errors.New("expected end of string, got end of file"))
				return nil
			}
		}

		if canEscape && r == '\\' && n == quoteC {
			l.bufferRune(r)
			l.bufferRune(n)
			return lexSkipThen(fn)
		}

		l.bufferRune(r)
		if r == quoteC {
			l.emit(String)
			return lex
		}

		return fn
	}
	return fn
}

func lexSkipThen(then lexerFn) lexerFn {
	return func(l *Lexer) lexerFn {
		if _, err := l.readRune(); err != nil {
			l.emitErr(err)
			return nil
		}
		return then
	}
}
@@ -1,82 +0,0 @@
package lexer

import (
	"bytes"
	. "testing"

	"github.com/stretchr/testify/assert"
	"github.com/stretchr/testify/require"
)

var lexTestSrc = `
	// this is a comment
	// // this is also a comment
	a
	anIdentifier
	1
	100
	1.5
	1.5e9

	/*
	some stuff
	*/

	/* this should actually work */
	/*/

	/*
	nested!
	/*
	wtf this is crazy
	*/
	*/

	(punctuation,is{cool}> )
	-tab

	"this is a string", "and so is this one"
	"\"foo"
	"bar\"baz\""
	"buz\0"
`

func TestLex(t *T) {
	l := New(bytes.NewBufferString(lexTestSrc))

	assertNext := func(typ TokenType, val string, row, col int) {
		t.Logf("asserting %s %q [row:%d col:%d]", typ, val, row, col)
		require.True(t, l.HasNext())
		tok := l.Next()
		assert.Equal(t, typ, tok.TokenType)
		assert.Equal(t, val, tok.Val)
		assert.Equal(t, row, tok.Row)
		assert.Equal(t, col, tok.Col)
	}

	assertNext(Identifier, "a", 3, 2)
	assertNext(Identifier, "anIdentifier", 4, 2)
	assertNext(Identifier, "1", 5, 2)
	assertNext(Identifier, "100", 6, 2)
	assertNext(Identifier, "1.5", 7, 2)
	assertNext(Identifier, "1.5e9", 8, 2)
	assertNext(Wrapper, "(", 24, 2)
	assertNext(Identifier, "punctuation", 24, 3)
	assertNext(Punctuation, ",", 24, 14)
	assertNext(Identifier, "is", 24, 15)
	assertNext(Wrapper, "{", 24, 17)
	assertNext(Identifier, "cool", 24, 18)
	assertNext(Wrapper, "}", 24, 22)
	assertNext(Punctuation, ">", 24, 23)
	assertNext(Wrapper, ")", 24, 25)
	assertNext(Identifier, "-tab", 25, 2)
	assertNext(String, `"this is a string"`, 27, 2)
	assertNext(Punctuation, ",", 27, 20)
	assertNext(String, `"and so is this one"`, 27, 22)
	assertNext(String, `"\"foo"`, 28, 2)
	assertNext(String, `"bar\"baz\""`, 29, 2)
	assertNext(String, `"buz\0"`, 30, 2)
	assertNext(EOF, "EOF", 31, 0)

	assert.False(t, l.HasNext())
}
main.go (deleted, 47 lines)
@@ -1,47 +0,0 @@
package main

import (
	"fmt"

	"github.com/mediocregopher/ginger/lang"
	"github.com/mediocregopher/ginger/vm"
)

func main() {
	mkcmd := func(a lang.Atom, args ...lang.Term) lang.Tuple {
		if len(args) == 1 {
			return lang.Tuple{a, args[0]}
		}
		return lang.Tuple{a, lang.Tuple(args)}
	}
	mkint := func(i string) lang.Tuple {
		return lang.Tuple{vm.Int, lang.Const(i)}
	}

	//foo := lang.Atom("foo")
	//tt := []lang.Term{
	//	mkcmd(vm.Assign, foo, mkint("1")),
	//	mkcmd(vm.Add, mkcmd(vm.Tuple, mkcmd(vm.Var, foo), mkint("2"))),
	//}

	foo := lang.Atom("foo")
	bar := lang.Atom("bar")
	baz := lang.Atom("baz")
	tt := []lang.Term{
		mkcmd(vm.Assign, foo, mkcmd(vm.Tuple, mkint("1"), mkint("2"))),
		mkcmd(vm.Assign, bar, mkcmd(vm.Add, mkcmd(vm.Var, foo))),
		mkcmd(vm.Assign, baz, mkcmd(vm.Add, mkcmd(vm.Var, foo))),
		mkcmd(vm.Add, mkcmd(vm.Tuple, mkcmd(vm.Var, bar), mkcmd(vm.Var, baz))),
	}

	mod, err := vm.Build(tt...)
	if err != nil {
		panic(err)
	}
	defer mod.Dispose()

	mod.Dump()

	out, err := mod.Run()
	fmt.Printf("\n\n########\nout: %v %v\n", out, err)
}
@@ -1,39 +0,0 @@
package list

import "fmt"

/*
	+ size isn't really _necessary_ unless O(1) Len is wanted
	+ append doesn't work well on stack
*/

type List struct {
	// in practice this would be a constant size, with the compiler knowing the
	// size
	underlying []int
	head, size int
}

func New(ii ...int) List {
	l := List{
		underlying: make([]int, len(ii)),
		size:       len(ii),
	}
	copy(l.underlying, ii)
	return l
}

func (l List) Len() int {
	return l.size
}

func (l List) HeadTail() (int, List) {
	if l.size == 0 {
		panic(fmt.Sprintf("can't take HeadTail of empty list"))
	}
	return l.underlying[l.head], List{
		underlying: l.underlying,
		head:       l.head + 1,
		size:       l.size - 1,
	}
}
sandbox/syntax.txt (new file, 108 lines)
@@ -0,0 +1,108 @@
# 2021/08/26
#
# output of godoc on gg is this:

var ZeroGraph = &Graph{ ... }
func Equal(g1, g2 *Graph) bool
type Edge struct{ ... }
type Graph struct{ ... }
type OpenEdge struct{ ... }
    func TupleOut(ins []OpenEdge, edgeVal Value) OpenEdge
    func ValueOut(val, edgeVal Value) OpenEdge
type Value struct{ ... }
    func NewValue(V interface{}) Value
type Vertex struct{ ... }
type VertexType string
const ValueVertex VertexType = "value" ...

We just need to formulate a syntax which describes these operations and
entities.

Based on an old note I found in this file it seems like it reads better to
actually order everything "backwards" in the syntax, so I'm going to go with
that. I left the note at the bottom, commented out.

-(<openEdge>,...)             TupleOut (an openEdge)
-<edgeVal>-(<openEdge>,...)   TupleOut with an edgeVal (an openEdge)
-<val>                        ValueOut (an openEdge)
-<edgeVal>-<val>              ValueOut with an edgeVal (an openEdge)
{<val> <openEdge>, ...}       ValueIn (a graph)

values can only be alphanumeric, or graphs.
TODO what to do about negative numbers? -1 is ambiguous

This means the below fibonacci can be done using:

{
    decr -{ out -sub-(-in, -1) }

    out -{
        n -0-in,
        a -1-in,
        b -2-in,

        out -if-(
            -zero?-n,
            -a,
            -recur-(
                -decr-n,
                -b,
                -add-(-a,-b)
            ),
        )
    }-(-in, -0, -1)
}

###

Let's try to get rid of all the ugly prefix dashes (and maybe solve the -1
question). We ditch the dashes altogether; TupleOut with an edgeVal can be
done by just joining the two, and ValueOut with an edgeVal we can just make look
like a TupleOut with a single openEdge (which... it kind of is anyway).

(<openEdge>,...)            TupleOut (an openEdge)
<edgeVal>(<openEdge>,...)   TupleOut with an edgeVal (an openEdge)
<val>                       ValueOut (an openEdge)
<edgeVal>(<val>)            ValueOut with an edgeVal (an openEdge)
{<val> <openEdge>[, ...]}   ValueIn (a graph)

values can only be alphanumeric, or graphs.

```
{
    decr { out add(in, -1) }

    out {
        n 0(in),
        a 1(in),
        b 2(in),

        out if(
            zero?(n),
            a,
            recur(decr(n), b, add(a,b))
        )

    }(in, 0, 1)
}
```

################
# The Old Note #
################
#
#decr- add- |- in
#           |- (-1)
#
#fib- (
#    fibInner- (
#        {n, a, b}- in
#        out- if- |- zero?- n
#                 |- a
#                 |- fibInner- |- decr- n
#                              |- b
#                              |- add- {a,b}
#    )
#)
#
#out- fib- atoi- first- in
vm/cmds.go (deleted, 280 lines)
@@ -1,280 +0,0 @@
package vm

import (
	"errors"
	"fmt"
	"strconv"

	"github.com/mediocregopher/ginger/lang"
	"llvm.org/llvm/bindings/go/llvm"
)

type op interface {
	inType() valType
	outType() valType
	build(*Module) (llvm.Value, error)
}

type valType struct {
	term lang.Term
	llvm llvm.Type
}

func (vt valType) isInt() bool {
	return lang.Equal(Int, vt.term)
}

func (vt valType) eq(vt2 valType) bool {
	return lang.Equal(vt.term, vt2.term) && vt.llvm == vt2.llvm
}

// primitive valTypes
var (
	valTypeVoid = valType{term: lang.Tuple{}, llvm: llvm.VoidType()}
	valTypeInt  = valType{term: Int, llvm: llvm.Int64Type()}
)

////////////////////////////////////////////////////////////////////////////////

// most types don't have an input, so we use this as a shortcut
type voidIn struct{}

func (voidIn) inType() valType {
	return valTypeVoid
}

////////////////////////////////////////////////////////////////////////////////

type intOp struct {
	voidIn
	c lang.Const
}

func (io intOp) outType() valType {
	return valTypeInt
}

func (io intOp) build(mod *Module) (llvm.Value, error) {
	ci, err := strconv.ParseInt(string(io.c), 10, 64)
	if err != nil {
		return llvm.Value{}, err
	}
	return llvm.ConstInt(llvm.Int64Type(), uint64(ci), false), nil
}

////////////////////////////////////////////////////////////////////////////////

type tupOp struct {
	voidIn
	els []op
}

func (to tupOp) outType() valType {
	termTypes := make(lang.Tuple, len(to.els))
	llvmTypes := make([]llvm.Type, len(to.els))
	for i := range to.els {
		elValType := to.els[i].outType()
		termTypes[i] = elValType.term
		llvmTypes[i] = elValType.llvm
	}
	vt := valType{term: lang.Tuple{Tuple, termTypes}}
	if len(llvmTypes) == 0 {
		vt.llvm = llvm.VoidType()
	} else {
		vt.llvm = llvm.StructType(llvmTypes, false)
	}
	return vt
}

func (to tupOp) build(mod *Module) (llvm.Value, error) {
	str := llvm.Undef(to.outType().llvm)
	var val llvm.Value
	var err error
	for i := range to.els {
		if val, err = to.els[i].build(mod); err != nil {
			return llvm.Value{}, err
		}
		str = mod.b.CreateInsertValue(str, val, i, "")
	}
	return str, err
}

////////////////////////////////////////////////////////////////////////////////

type tupElOp struct {
	voidIn
	tup op
	i   int
}

func (teo tupElOp) outType() valType {
	tupType := teo.tup.outType()
	return valType{
		llvm: tupType.llvm.StructElementTypes()[teo.i],
		term: tupType.term.(lang.Tuple)[1].(lang.Tuple)[1],
	}
}

func (teo tupElOp) build(mod *Module) (llvm.Value, error) {
	if to, ok := teo.tup.(tupOp); ok {
		return to.els[teo.i].build(mod)
	}

	tv, err := teo.tup.build(mod)
	if err != nil {
		return llvm.Value{}, err
	}
	return mod.b.CreateExtractValue(tv, teo.i, ""), nil
}

////////////////////////////////////////////////////////////////////////////////

type varOp struct {
	op
	v     llvm.Value
	built bool
}

func (vo *varOp) build(mod *Module) (llvm.Value, error) {
	if !vo.built {
		var err error
		if vo.v, err = vo.op.build(mod); err != nil {
			return llvm.Value{}, err
		}
		vo.built = true
	}
	return vo.v, nil
}

type varCtx map[string]*varOp

func (c varCtx) assign(name string, vo *varOp) error {
	if _, ok := c[name]; ok {
		return fmt.Errorf("var %q already assigned", name)
	}
	c[name] = vo
	return nil
}

func (c varCtx) get(name string) (*varOp, error) {
	if o, ok := c[name]; ok {
		return o, nil
	}
	return nil, fmt.Errorf("var %q referenced before assignment", name)
}

////////////////////////////////////////////////////////////////////////////////

type addOp struct {
	voidIn
	a, b op
}

func (ao addOp) outType() valType {
	return ao.a.outType()
}

func (ao addOp) build(mod *Module) (llvm.Value, error) {
	av, err := ao.a.build(mod)
	if err != nil {
		return llvm.Value{}, err
	}
	bv, err := ao.b.build(mod)
	if err != nil {
		return llvm.Value{}, err
	}
	return mod.b.CreateAdd(av, bv, ""), nil
}

////////////////////////////////////////////////////////////////////////////////

func termToOp(ctx varCtx, t lang.Term) (op, error) {
	aPat := func(a lang.Atom) lang.Tuple {
		return lang.Tuple{lang.AAtom, a}
	}
	cPat := func(t lang.Term) lang.Tuple {
		return lang.Tuple{lang.AConst, t}
	}
	tPat := func(el ...lang.Term) lang.Tuple {
		return lang.Tuple{Tuple, lang.Tuple(el)}
	}

	if !lang.Match(tPat(aPat(lang.AUnder), lang.TDblUnder), t) {
		return nil, fmt.Errorf("term %v does not look like a vm command", t)
	}
	k := t.(lang.Tuple)[0].(lang.Atom)
	v := t.(lang.Tuple)[1]

	// for when v is a Tuple argument, convenience function for casting
	vAsTup := func(n int) ([]op, error) {
		vop, err := termToOp(ctx, v)
		if err != nil {
			return nil, err
		}
		ops := make([]op, n)
		for i := range ops {
			ops[i] = tupElOp{tup: vop, i: i}
		}

		return ops, nil
	}

	switch k {
	case Int:
		if !lang.Match(cPat(lang.AUnder), v) {
			return nil, errors.New("int requires constant arg")
		}
		return intOp{c: v.(lang.Const)}, nil
	case Tuple:
		if !lang.Match(lang.Tuple{Tuple, lang.AUnder}, v) {
			return nil, errors.New("tup requires tuple arg")
		}
		tup := v.(lang.Tuple)
		tc := tupOp{els: make([]op, len(tup))}
		var err error
		for i := range tup {
			if tc.els[i], err = termToOp(ctx, tup[i]); err != nil {
				return nil, err
			}
		}
		return tc, nil
	case Var:
		if !lang.Match(aPat(lang.AUnder), v) {
			return nil, errors.New("var requires atom arg")
		}
		name := v.(lang.Atom).String()
		return ctx.get(name)

	case Assign:
		if !lang.Match(tPat(tPat(aPat(Var), aPat(lang.AUnder)), lang.TDblUnder), v) {
			return nil, errors.New("assign requires 2-tuple arg, the first being a var")
		}
		tup := v.(lang.Tuple)
		name := tup[0].(lang.Tuple)[1].String()
		o, err := termToOp(ctx, tup[1])
		if err != nil {
			return nil, err
		}

		vo := &varOp{op: o}
		if err := ctx.assign(name, vo); err != nil {
			return nil, err
		}
		return vo, nil

	// Add is special in some way, I think it's a function not a compiler op,
	// not sure yet though
	case Add:
		els, err := vAsTup(2)
		if err != nil {
			return nil, err
		} else if !els[0].outType().eq(valTypeInt) {
			return nil, errors.New("add args must be numbers of the same type")
		} else if !els[1].outType().eq(valTypeInt) {
			return nil, errors.New("add args must be numbers of the same type")
		}
		return addOp{a: els[0], b: els[1]}, nil
	default:
		return nil, fmt.Errorf("op %v unknown, or its args are malformed", t)
	}
}
vm/function.go (new file, 246 lines)
@@ -0,0 +1,246 @@
package vm

import (
	"errors"
	"fmt"

	"code.betamike.com/mediocregopher/ginger/gg"
	"code.betamike.com/mediocregopher/ginger/graph"
)

// Function is an entity which accepts an argument Value and performs some
// internal processing on that argument to return a resultant Value.
type Function interface {
	Perform(Value) Value
}

// FunctionFunc is a function which implements the Function interface.
type FunctionFunc func(Value) Value

// Perform calls the underlying FunctionFunc directly.
func (f FunctionFunc) Perform(arg Value) Value {
	return f(arg)
}

// Identity returns a Function which always returns the given Value,
// regardless of the input argument.
//
// TODO this might not be the right name
func Identity(val Value) Function {
	return FunctionFunc(func(Value) Value {
		return val
	})
}

var (
	valNameIn     = Value{Value: gg.Name("!in")}
	valNameOut    = Value{Value: gg.Name("!out")}
	valNameIf     = Value{Value: gg.Name("!if")}
	valNameRecur  = Value{Value: gg.Name("!recur")}
	valNumberZero = Value{Value: gg.Number(0)}
)

func checkGraphForFunction(g *gg.Graph) error {
	for _, val := range g.AllValueIns() {
		if val.Name == nil {
			return fmt.Errorf("non-name %v cannot have incoming edges", val)
		}

		if !(Value{Value: val}).Equal(valNameOut) && (*val.Name)[0] == '!' {
			return fmt.Errorf("name %v cannot start with a '!'", val)
		}
	}

	// TODO check for acyclic-ness

	return nil
}

// FunctionFromGraph wraps the given Graph such that it can be used as a
// Function. The given Scope determines what values outside of the Graph are
// available for use within the Function.
func FunctionFromGraph(g *gg.Graph, scope Scope) (Function, error) {

	if err := checkGraphForFunction(g); err != nil {
		return nil, err
	}

	// edgeFn is distinct from a generic Function in that the Value passed into
	// Perform will _always_ be the value of "in" for the overall Function.
	//
	// edgeFns will wrap each other, passing "in" downwards to the leaf edgeFns.
	type edgeFn Function

	var compileEdge func(*gg.OpenEdge) (edgeFn, error)

	// TODO memoize?
	valToEdgeFn := func(val Value) (edgeFn, error) {

		if val.Name == nil {
			return edgeFn(Identity(val)), nil
		}

		if val.Equal(valNameIn) {
			return edgeFn(FunctionFunc(func(inArg Value) Value {
				return inArg
			})), nil
		}

		edgesIn := g.ValueIns(val.Value)

		name := *val.Name

		if l := len(edgesIn); l == 0 {
			resolvedVal, err := scope.Resolve(name)
			if errors.Is(err, ErrNameNotDefined) {
				return edgeFn(Identity(val)), nil
			} else if err != nil {
				return nil, fmt.Errorf("resolving name %q from the outer scope: %w", name, err)
			}

			return edgeFn(Identity(resolvedVal)), nil

		} else if l != 1 {
			return nil, fmt.Errorf("resolved name %q to %d input edges, rather than one", name, l)
		}

		edge := edgesIn[0]

		return compileEdge(edge)
	}

	// "out" resolves to more than a static value, treat the graph as a full
	// operation.

	// thisFn is used to support recur. It will get filled in with the Function
	// which is returned by this function, once that Function is created.
	thisFn := new(Function)

	compileEdge = func(edge *gg.OpenEdge) (edgeFn, error) {

		return graph.MapReduce[gg.OptionalValue, gg.Value, edgeFn](
			edge,
			func(ggVal gg.Value) (edgeFn, error) {
				return valToEdgeFn(Value{Value: ggVal})
			},
			func(ggEdgeVal gg.OptionalValue, inEdgeFns []edgeFn) (edgeFn, error) {
				if ggEdgeVal.Equal(valNameIf.Value) {

					if len(inEdgeFns) != 3 {
						return nil, fmt.Errorf("'!if' requires a 3-tuple argument")
					}

					return edgeFn(FunctionFunc(func(inArg Value) Value {

						if pred := inEdgeFns[0].Perform(inArg); pred.Equal(valNumberZero) {
							return inEdgeFns[2].Perform(inArg)
						}

						return inEdgeFns[1].Perform(inArg)

					})), nil
				}

				// "!if" statements (above) are the only case where we want the
				// input edges to remain separated, otherwise they should always
				// be combined into a single edge whose value is a tuple. Do
				// that here.

				inEdgeFn := inEdgeFns[0]

				if len(inEdgeFns) > 1 {
					inEdgeFn = edgeFn(FunctionFunc(func(inArg Value) Value {
						tupVals := make([]Value, len(inEdgeFns))

						for i := range inEdgeFns {
							tupVals[i] = inEdgeFns[i].Perform(inArg)
						}

						return Tuple(tupVals...)
					}))
				}

				var edgeVal Value
				if ggEdgeVal.Valid {
					edgeVal.Value = ggEdgeVal.Value
				}

				if edgeVal.IsZero() {
					return inEdgeFn, nil
				}

				if edgeVal.Equal(valNameRecur) {
					return edgeFn(FunctionFunc(func(inArg Value) Value {
						return (*thisFn).Perform(inEdgeFn.Perform(inArg))
					})), nil
				}

				if edgeVal.Graph != nil {

					opFromGraph, err := FunctionFromGraph(edgeVal.Graph, scope)
					if err != nil {
						return nil, fmt.Errorf("compiling graph to operation: %w", err)
					}

					edgeVal = Value{Function: opFromGraph}
				}

				// The Function is known at compile-time, so we can wrap it
				// directly into an edgeVal using the existing inEdgeFn as the
				// input.
				if edgeVal.Function != nil {
					return edgeFn(FunctionFunc(func(inArg Value) Value {
						return edgeVal.Function.Perform(inEdgeFn.Perform(inArg))
					})), nil
				}

				// the edgeVal is not a Function at compile time, and so
				// it must become one at runtime. We must resolve edgeVal to an
				// edgeFn as well (edgeEdgeFn), and then at runtime that is
				// given the inArg and (hopefully) the resultant Function is
				// called.

				edgeEdgeFn, err := valToEdgeFn(edgeVal)

				if err != nil {
					return nil, err
				}

				return edgeFn(FunctionFunc(func(inArg Value) Value {

					runtimeEdgeVal := edgeEdgeFn.Perform(inArg)

					if runtimeEdgeVal.Graph != nil {

						runtimeFn, err := FunctionFromGraph(
							runtimeEdgeVal.Graph, scope,
						)

						if err != nil {
							panic(fmt.Sprintf("compiling graph to operation: %v", err))
						}

						runtimeEdgeVal = Value{Function: runtimeFn}
					}

					if runtimeEdgeVal.Function == nil {
						panic("edge value must be an operation")
					}

					return runtimeEdgeVal.Function.Perform(inEdgeFn.Perform(inArg))

				})), nil
			},
		)
	}

	graphFn, err := valToEdgeFn(valNameOut)

	if err != nil {
		return nil, err
	}

	*thisFn = Function(graphFn)

	return Function(graphFn), nil
}
vm/scope.go (new file, 58 lines)
@@ -0,0 +1,58 @@
package vm

import (
	"errors"
)

// ErrNameNotDefined is returned from Scope.Resolve when a name could not be
// resolved within a Scope.
var ErrNameNotDefined = errors.New("not defined")

// Scope encapsulates a set of name->Value mappings.
type Scope interface {

	// Resolve accepts a name and returns a Value, or returns
	// ErrNameNotDefined.
	Resolve(string) (Value, error)
}

// ScopeMap implements the Scope interface.
type ScopeMap map[string]Value

var _ Scope = ScopeMap{}

// Resolve uses the given name as a key into the ScopeMap map, and
// returns the Value held there for the key, if any.
func (m ScopeMap) Resolve(name string) (Value, error) {

	v, ok := m[name]

	if !ok {
		return Value{}, ErrNameNotDefined
	}

	return v, nil
}

type scopeWith struct {
	Scope // parent
	name  string
	val   Value
}

// ScopeWith returns a copy of the given Scope, except that evaluating the given
// name will always return the given Value.
func ScopeWith(scope Scope, name string, val Value) Scope {
	return &scopeWith{
		Scope: scope,
		name:  name,
		val:   val,
	}
}

func (s *scopeWith) Resolve(name string) (Value, error) {
	if name == s.name {
		return s.val, nil
	}
	return s.Scope.Resolve(name)
}
vm/scope_global.go (new file, 61 lines)
@@ -0,0 +1,61 @@
package vm

import (
	"fmt"

	"code.betamike.com/mediocregopher/ginger/gg"
)

func globalFn(fn func(Value) (Value, error)) Value {
	return Value{
		Function: FunctionFunc(func(in Value) Value {
			res, err := fn(in)
			if err != nil {
				panic(err)
			}
			return res
		}),
	}
}

// GlobalScope contains operations and values which are available from within
// any operation in a ginger program.
var GlobalScope = ScopeMap{

	"!add": globalFn(func(val Value) (Value, error) {

		var sum int64

		for _, tupVal := range val.Tuple {

			if tupVal.Number == nil {
				return Value{}, fmt.Errorf("add requires a non-zero tuple of numbers as an argument")
			}

			sum += *tupVal.Number
		}

		return Value{Value: gg.Value{Number: &sum}}, nil

	}),

	"!tupEl": globalFn(func(val Value) (Value, error) {

		tup, i := val.Tuple[0], val.Tuple[1]

		return tup.Tuple[int(*i.Number)], nil

	}),

	"!isZero": globalFn(func(val Value) (Value, error) {

		if *val.Number == 0 {
			one := int64(1)
			return Value{Value: gg.Value{Number: &one}}, nil
		}

		zero := int64(0)
		return Value{Value: gg.Value{Number: &zero}}, nil

	}),
}
190
vm/vm.go
190
vm/vm.go
@ -1,129 +1,123 @@
|
|||||||
|
// Package vm implements the execution of gg.Graphs as programs.
|
||||||
package vm
|
package vm
|
||||||
|
|
||||||
import (
|
import (
|
||||||
"errors"
|
"errors"
|
||||||
"fmt"
|
"fmt"
|
||||||
"sync"
|
"io"
|
||||||
|
"strings"
|
||||||
|
|
||||||
"github.com/mediocregopher/ginger/lang"
|
"code.betamike.com/mediocregopher/ginger/gg"
|
||||||
|
"code.betamike.com/mediocregopher/ginger/graph"
|
||||||
"llvm.org/llvm/bindings/go/llvm"
|
|
||||||
)
|
)
|
||||||
|
|
||||||
// Types supported by the vm in addition to those which are part of lang
|
// ZeroValue is a Value with no fields set. It is equivalent to the 0-tuple.
|
||||||
var (
|
var ZeroValue Value
|
||||||
Atom = lang.AAtom
|
|
||||||
Tuple = lang.ATuple
|
|
||||||
Int = lang.Atom("int")
|
|
||||||
)
|
|
||||||
|
|
||||||
// Ops supported by the vm
|
// Value extends a gg.Value to include Functions and Tuples as a possible
|
||||||
var (
|
// types.
|
||||||
Add = lang.Atom("add")
|
type Value struct {
|
||||||
Assign = lang.Atom("assign")
|
gg.Value
|
||||||
Var = lang.Atom("var")
|
|
||||||
)
|
|
||||||
|
|
||||||
////////////////////////////////////////////////////////////////////////////////
|
Function
|
||||||
|
Tuple []Value
|
||||||
// Module contains a compiled set of code which can be run, dumped in IR form,
|
|
||||||
// or compiled. A Module should be Dispose()'d of once it's no longer being
|
|
||||||
// used.
|
|
||||||
type Module struct {
|
|
||||||
b llvm.Builder
|
|
||||||
m llvm.Module
|
|
||||||
ctx varCtx
|
|
||||||
mainFn llvm.Value
|
|
||||||
}
|
}
|
||||||
|
|
||||||
var initOnce sync.Once
|
// Tuple returns a tuple Value comprising the given Values. Calling Tuple with
|
||||||
|
// no arguments returns ZeroValue.
|
||||||
// Build creates a new Module by compiling the given Terms as code
|
func Tuple(vals ...Value) Value {
|
||||||
// TODO only take in a single Term, implement List and use that with a do op
|
return Value{Tuple: vals}
|
||||||
func Build(tt ...lang.Term) (*Module, error) {
|
|
||||||
initOnce.Do(func() {
|
|
||||||
llvm.LinkInMCJIT()
|
|
||||||
llvm.InitializeNativeTarget()
|
|
||||||
llvm.InitializeNativeAsmPrinter()
|
|
||||||
})
|
|
||||||
mod := &Module{
|
|
||||||
b: llvm.NewBuilder(),
|
|
||||||
m: llvm.NewModule(""),
|
|
||||||
ctx: varCtx{},
|
|
||||||
}
|
|
||||||
|
|
||||||
var err error
|
|
||||||
if mod.mainFn, err = mod.buildFn(tt...); err != nil {
|
|
||||||
mod.Dispose()
|
|
||||||
return nil, err
|
|
||||||
}
|
|
||||||
|
|
||||||
if err := llvm.VerifyModule(mod.m, llvm.ReturnStatusAction); err != nil {
|
|
||||||
mod.Dispose()
|
|
||||||
return nil, fmt.Errorf("could not verify module: %s", err)
|
|
||||||
}
|
|
||||||
|
|
||||||
return mod, nil
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Dispose cleans up all resources held by the Module
|
// IsZero returns true if the Value is the zero value (aka the 0-tuple).
|
||||||
func (mod *Module) Dispose() {
|
// LexerToken (within the gg.Value) is ignored for this check.
|
||||||
// TODO this panics for some reason...
|
func (v Value) IsZero() bool {
|
||||||
//mod.m.Dispose()
|
return v.Equal(ZeroValue)
|
||||||
//mod.b.Dispose()
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// TODO make this return a val once we get function types
|
// Equal returns true if the passed in Value is equivalent, ignoring the
|
||||||
func (mod *Module) buildFn(tt ...lang.Term) (llvm.Value, error) {
|
// LexerToken on either Value.
|
||||||
if len(tt) == 0 {
|
//
|
||||||
return llvm.Value{}, errors.New("function cannot be empty")
|
// Will panic if the passed in v2 is not a Value from this package.
|
||||||
}
|
func (v Value) Equal(v2g graph.Value) bool {
|
||||||
|
|
||||||
ops := make([]op, len(tt))
|
v2 := v2g.(Value)
|
||||||
var err error
|
|
||||||
for i := range tt {
|
switch {
|
||||||
if ops[i], err = termToOp(mod.ctx, tt[i]); err != nil {
|
|
||||||
return llvm.Value{}, err
|
case (v.Value != (gg.Value{}) || v2.Value != (gg.Value{})):
|
||||||
|
return v.Value.Equal(v2.Value)
|
||||||
|
|
||||||
|
-		var llvmIns []llvm.Type
-		if in := ops[0].inType(); in.llvm.TypeKind() == llvm.VoidTypeKind {
-			llvmIns = []llvm.Type{}
-		} else {
-			llvmIns = []llvm.Type{in.llvm}
-		}
-		llvmOut := ops[len(ops)-1].outType().llvm
-
-		fn := llvm.AddFunction(mod.m, "", llvm.FunctionType(llvmOut, llvmIns, false))
-		block := llvm.AddBasicBlock(fn, "")
-		mod.b.SetInsertPoint(block, block.FirstInstruction())
-
-		var out llvm.Value
-		for i := range ops {
-			if out, err = ops[i].build(mod); err != nil {
-				return llvm.Value{}, err
-			}
-		}
-		mod.b.CreateRet(out)
-		return fn, nil
-	}
-
-// Dump dumps the Module's IR to stdout
-func (mod *Module) Dump() {
-	mod.m.Dump()
-}
-
-// Run executes the Module
-// TODO input and output?
-func (mod *Module) Run() (interface{}, error) {
-	engine, err := llvm.NewExecutionEngine(mod.m)
-	if err != nil {
-		return nil, err
-	}
-	defer engine.Dispose()
-
-	funcResult := engine.RunFunction(mod.mainFn, []llvm.GenericValue{})
-	defer funcResult.Dispose()
-	return funcResult.Int(false), nil
-}
+	case v.Function != nil || v2.Function != nil:
+		// for now we say that Functions can't be compared. This will probably
+		// get revisited later.
+		return false
+
+	case len(v.Tuple) == len(v2.Tuple):
+		for i := range v.Tuple {
+			if !v.Tuple[i].Equal(v2.Tuple[i]) {
+				return false
+			}
+		}
+		return true
+
+	default:
+		// if both were the zero value then both tuples would have the same
+		// length (0), which is covered by the previous check. So anything left
+		// over must be tuples with differing lengths.
+		return false
+	}
+}
+
+func (v Value) String() string {
+	switch {
+	case v.Function != nil:
+		// We can try to get better strings for ops later
+		return "<fn>"
+	case v.Value != (gg.Value{}):
+		return v.Value.String()
+	default:
+		// we consider the zero value to be the 0-tuple
+		strs := make([]string, len(v.Tuple))
+		for i := range v.Tuple {
+			strs[i] = v.Tuple[i].String()
+		}
+		return fmt.Sprintf("(%s)", strings.Join(strs, ", "))
+	}
+}
+
+// EvaluateSource reads and parses the io.Reader as an operation, input is used
+// as the argument to the operation, and the resultant value is returned.
+//
+// scope contains pre-defined operations and values which are available during
+// the evaluation.
+func EvaluateSource(opSrc io.Reader, input Value, scope Scope) (Value, error) {
+	v, err := gg.NewDecoder(opSrc).Next()
+	if err != nil {
+		return Value{}, err
+	} else if v.Value.Graph == nil {
+		return Value{}, errors.New("value must be a graph")
+	}
+
+	fn, err := FunctionFromGraph(v.Value.Graph, scope)
+	if err != nil {
+		return Value{}, err
+	}
+	return fn.Perform(input), nil
+}
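The new Equal and String methods above compare and render values structurally: functions never compare equal, tuples compare element-wise, and the zero value renders as the 0-tuple `()`. The same semantics can be sketched with a self-contained, hypothetical `value` type (a simplification of the package's actual `Value`, which also carries gg scalars and functions; here a non-empty `Scalar` string stands in for a scalar value):

```go
package main

import (
	"fmt"
	"strings"
)

// value is a hypothetical stand-in for the diff's Value type: it is either a
// scalar (non-empty Scalar) or a tuple of values; the zero value is the
// 0-tuple. This conflates "no scalar" with the empty string, which the real
// type avoids.
type value struct {
	Scalar string
	Tuple  []value
}

// equal mirrors the Equal semantics above: scalars compare by value, tuples
// compare element-wise, and tuples of differing lengths are unequal.
func (v value) equal(v2 value) bool {
	if v.Scalar != "" || v2.Scalar != "" {
		return v.Scalar == v2.Scalar
	}
	if len(v.Tuple) != len(v2.Tuple) {
		return false
	}
	for i := range v.Tuple {
		if !v.Tuple[i].equal(v2.Tuple[i]) {
			return false
		}
	}
	return true
}

// String renders tuples as "(a, b, ...)", matching the String method above.
func (v value) String() string {
	if v.Scalar != "" {
		return v.Scalar
	}
	strs := make([]string, len(v.Tuple))
	for i := range v.Tuple {
		strs[i] = v.Tuple[i].String()
	}
	return fmt.Sprintf("(%s)", strings.Join(strs, ", "))
}

func main() {
	a := value{Tuple: []value{{Scalar: "1"}, {Scalar: "2"}}}
	b := value{Tuple: []value{{Scalar: "1"}, {Scalar: "2"}}}
	fmt.Println(a)                // (1, 2)
	fmt.Println(a.equal(b))       // true
	fmt.Println(a.equal(value{})) // false, lengths 2 and 0 differ
}
```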
106	vm/vm_test.go
@@ -1,84 +1,58 @@
 package vm
 
 import (
-	. "testing"
+	"bytes"
+	"strconv"
+	"testing"
 
-	"github.com/mediocregopher/ginger/lang"
+	"code.betamike.com/mediocregopher/ginger/gg"
+	"github.com/stretchr/testify/assert"
 )
 
-func TestCompiler(t *T) {
-	mkcmd := func(a lang.Atom, args ...lang.Term) lang.Tuple {
-		// TODO a 1-tuple should be the same as its element?
-		if len(args) == 1 {
-			return lang.Tuple{a, args[0]}
-		}
-		return lang.Tuple{a, lang.Tuple(args)}
-	}
-	mkint := func(i string) lang.Tuple {
-		return lang.Tuple{Int, lang.Const(i)}
-	}
-
-	type test struct {
-		in  []lang.Term
-		exp uint64
-	}
-
-	one := mkint("1")
-	two := mkint("2")
-	foo := mkcmd(Var, lang.Atom("foo"))
-	bar := mkcmd(Var, lang.Atom("bar"))
-	baz := mkcmd(Var, lang.Atom("baz"))
-
-	tests := []test{
-		{
-			in:  []lang.Term{one},
-			exp: 1,
-		},
-		{
-			in: []lang.Term{
-				mkcmd(Add, mkcmd(Tuple, one, two)),
-			},
-			exp: 3,
-		},
-		{
-			in: []lang.Term{
-				mkcmd(Assign, foo, one),
-				mkcmd(Add, mkcmd(Tuple, foo, two)),
-			},
-			exp: 3,
-		},
-		{
-			in: []lang.Term{
-				mkcmd(Assign, foo, mkcmd(Tuple, one, two)),
-				mkcmd(Add, foo),
-			},
-			exp: 3,
-		},
-		{
-			in: []lang.Term{
-				mkcmd(Assign, foo, mkcmd(Tuple, one, two)),
-				mkcmd(Assign, bar, mkcmd(Add, foo)),
-				mkcmd(Assign, baz, mkcmd(Add, foo)),
-				mkcmd(Add, mkcmd(Tuple, bar, baz)),
-			},
-			exp: 6,
-		},
-	}
-
-	for _, test := range tests {
-		t.Logf("testing program: %v", test.in)
-		mod, err := Build(test.in...)
-		if err != nil {
-			t.Fatalf("building failed: %s", err)
-		}
-
-		out, err := mod.Run()
-		if err != nil {
-			mod.Dump()
-			t.Fatalf("running failed: %s", err)
-		} else if out != test.exp {
-			mod.Dump()
-			t.Fatalf("expected result %T:%v, got %T:%v", test.exp, test.exp, out, out)
-		}
-	}
-}
+func TestVM(t *testing.T) {
+	tests := []struct {
+		src    string
+		in     Value
+		exp    Value
+		expErr string
+	}{
+		{
+			src: `{
+				incr = { !out = !add < (1, !in); };
+				!out = incr < incr < !in;
+			}`,
+			in:  Value{Value: gg.Number(5)},
+			exp: Value{Value: gg.Number(7)},
+		},
+		{
+			src: `{
+				!foo = in;
+				!out = !foo;
+			}`,
+			in:     Value{Value: gg.Number(1)},
+			expErr: "name !foo cannot start with a '!'",
+		},
+		{
+			src: `{foo = bar; !out = foo;}`,
+			in:  Value{},
+			exp: Value{Value: gg.Name("bar")},
+		},
+	}
+
+	for i, test := range tests {
+		t.Run(strconv.Itoa(i), func(t *testing.T) {
+			t.Log(test.src)
+			val, err := EvaluateSource(
+				bytes.NewBufferString(test.src), test.in, GlobalScope,
+			)
+			if test.expErr != "" {
+				assert.Error(t, err)
+				assert.Equal(t, test.expErr, err.Error())
+			} else {
+				assert.NoError(t, err)
+				assert.True(t, val.Equal(test.exp), "%v != %v", test.exp, val)
+			}
+		})
+	}
+}