Lexon is a plain-text programming language for law and smart contracts. It is symbolic AI.
Lexon is the first of a new generation of languages. Its grammar describes the intersection of natural human language and higher order logic in the way that Wittgenstein demanded. To a degree, it ends the quest for an unambiguous universal language for philosophy and pure thought as envisioned by Leibniz, Frege, or Russell. The key to this is how Lexon maps natural language to compiler-building tools, which is intuitively convincing and in line with what those tools were designed for, but different from what computer science had gotten used to.
Lexon is the language that Robotic Laws will be articulated in, embedding unambiguous limitations into autonomous machines, written by lawmakers, with the code being official law, created and approved in the democratic process. Lexon thus solves a long-standing question of Computational Law. It works for blockchain smart contracts as well as off-line, and also off-machine. For its advantages in transparency and accessibility, it may become a mainstream programming language. Beyond its uses in connection with computers, it may over time replace today's legalese as a more useful language for law and contracting. The counter-arguments of the legal profession are addressed in the available publications that discuss Lexon. The work of professors of law on Lexon may serve as an invitation to imagine that progress is possible, even for a two-thousand-year-old industry.
Lexon is the language of lawmakers and programmers alike, enabling the coming profession of the legal engineer.
Lexon is under development. There is no consistent distribution available yet for the current compiler, which is the fourth implementation, adding significant depth. Documents, demos and source code currently reflect a mix of grammar versions 0.2 and 0.3.
If you are new to the concept, please read on. If you have the time, consider turning to the book.
This is an example of Lexon code:
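A minimal escrow agreement along the lines of the published Lexon demos may serve here; the phrasing follows the 0.2 grammar and is illustrative, as keywords and wording vary between grammar versions:

```lexon
"Payer" is a person.
"Payee" is a person.
"Arbiter" is a person.
"Fee" is an amount.

The Payer pays an Amount into escrow,
appoints the Payee,
appoints the Arbiter,
and also fixes the Fee.

CLAUSE: Pay Out.
The Arbiter may pay from escrow the Fee to themselves,
and afterwards pay the remainder of the escrow to the Payee.

CLAUSE: Pay Back.
The Arbiter may pay from escrow the Fee to themselves,
and afterwards return the remainder of the escrow to the Payer.
```

Each clause compiles to a callable function; whether the Arbiter triggers Pay Out or Pay Back decides how the escrowed funds move.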
Anyone can read this text and understand what it means. It can be shown to a judge; it can be understood by business partners and customers as well as a company’s management and legal department; and it can also – as is – be run as a program, for example on a blockchain, i.e., as a smart contract.
Soon, any type of program can be written this way. And any type of agreement can be automated and made impossible to break. This will uncouple business necessities from the judicial and executive powers, with their astronomical costs and glacial speed. Digital contracts cost pennies to set in motion and can securely make any sum of money change hands in minutes. This will be a game changer for a massive slice of commercial activity and enable a long tail of private trade. It will also change the standards for governance and government.
Blockchain technology was made by hackers for hackers – but with Lexon, anyone can read programs now without any knowledge of programming. And thus, consumers, as well as businesspeople, judges, jury members, even lawmakers can read any smart contract about which they might be tasked to decide, investigate, legislate, to verify or enter. Through this, contracting may become part of the definition of literacy and a silver arrow in the quiver of democracy.
As lawyers confirm, the code above is a legally enforceable contract: it can be used to demonstrate to a judge what the meeting of the minds of the parties to the contract was. There are no style requirements for a contract. There cannot be any, or else a typo or a poor grasp of grammar could render contracts invalid. But smart contract code written in, e.g., Solidity or Sophia would always lead to a battle of experts if brought to court, because non-programmers cannot read it.
Not all contracts need to be in writing. The ‘contract’ itself is always the abstract agreement of two parties, no matter how it was expressed. A signed paper merely proves it. Now, a readable, digitally signed program can prove and perform this will.
A comparison with other computer languages that strive to be more broadly readable, as well as with popular artificial human languages can be found in the appendix of the Lexon book.
Interesting questions can be asked when comparing Lexon to Fuchs' Attempto Controlled English (ACE) and Kowalski's Logical English. The more important difference might be that both projects start from first order logic, while Lexon was from the start focussed on higher order logic. A high level view might be helpful:
ACE is a successful attempt to articulate first order logic in natural language. This is not necessarily enough to achieve programmability or to capture the complexity of legal prose. At heart, the question is to what extent our thinking and our languages are structured along the lines of first order logic in the first place. In fact, an entire branch of AI wrestled with the challenge of reconciling logic and thinking for decades; and Prof. Kowalski, a passionate logician, pioneered work that explored what could be termed the less logical side of our thinking, namely abductive logic. Lexon makes it possible to express more complex thoughts than first order logic does, as it starts from programming, which is higher order logic. Like ACE's, this is still only a subset of how we think and express ourselves, but a significantly larger one that covers the breadth of the language of science and business.
The mainstay AI language Prolog, which Kowalski co-invented, is the basis for Logical English. In principle, Prolog strives to replace some aspects of programming: its magic allows a logician to state rules, the 'what', instead of a programmer having to also articulate the 'how'. The same feature is successfully employed in SQL, which has for decades been the dominant language in the IT industry for expressing data access. Lexon is a more modest approach that focusses on natural language first and creates a bridge between the most natural-looking prose and the mainstream 3rd generation programming languages used in the industry today. It inherits and fully uses the latter's notion of state, with an explicit concept of time and progress. That Lexon 'looks' declarative is owed to the hybrid nature of legal agreements: they do in fact state rules, but sometimes also processes. And a core discovery during the invention of Prolog was that rules can be interpreted as instructions. Lexon uses the convention, found in Prolog and functional programming, of short functions that read declaratively on their own, letting the user trigger a flow that is mostly controlled by the sequence in which the user calls the functions. Centrally for this, state is preserved between calls, and the main focus for user and system is how the state changes over time. Structurally, this equates Lexon functions (clauses) with Prolog queries, not with Prolog clauses, and reflects the reality both of the public functions of smart contracts, e.g. in Ethereum Solidity, and of clauses in legal agreements, where they focus on expectable events.
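The structural point, that clauses read declaratively on their own but act as callable entry points over persistent state, can be sketched with a hypothetical fragment; party names and wording are invented for illustration, in the style of the 0.2 grammar:

```lexon
"Buyer" is a person.
"Seller" is a person.

CLAUSE: Order.
The Buyer may pay an Amount into escrow and appoint the Seller.

CLAUSE: Deliver.
The Seller may certify the delivery.

CLAUSE: Settle.
Afterwards, the Seller may be paid the Amount from escrow.
```

Each clause reads like a stated rule, yet the flow is the sequence in which the parties call them, and the escrow balance is the state preserved between calls, much like a sequence of public function calls on a Solidity contract.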
The 2023 compiler 0.3 features significant updates that faithfully realize the vision of the 2017 whitepaper. The readability of the latest Lexon syntax is higher than originally expected. The complexity of the grammar is much higher than the 2017 starting point, approaching that of a 3rd generation programming language, but in light of its application it remains remarkably simple. The invention of LGF (see below) helps keep it succinct and highly readable.
Today's fourth implementation of the compiler uses standard C compiler-building technology for an overall build chain that is deeper than usual but ultimately remains slim and tidy for a compiler. It is again based on Backus-Naur Form (BNF), after Parsing Expression Grammar (PEG) appeared to hamper progress. In terms of IT projects, the compiler remains a small program and features minimal dependencies, making it portable across systems. Builds are tested on Darwin UNIX (Mac) and Ubuntu Linux.
An important element of the overall Lexon build chain is the Lexon Grammar Form (LGF), which modestly extends Backus-Naur Form (BNF) to better capture the variability of natural language. The LGF compiler transpiles LGF to BNF and is included in the Lexon compiler.
The size and complexity of the existing examples is considerable: enough to break the Ethereum smart contract size limit, or to implement an entire UCC form. Lexon will scale because it is built with the same tools and techniques as modern C, C++, C# and Java compilers. Its challenges are no different and its maximum code size will be comparable. The size of the Lexon examples is arguably larger than that of comparable projects.
For the upcoming grammar and compiler versions 0.4, targets have been defined using a concrete DeFi DAO project.
Prof. C. C. Clack of University College London compares Lexon to other approaches and stresses "the importance of language design in the development of reliable Smart Contracts" in his research into the "gap that exists between the disciplines of law and computer science." He singles out Kowalski's Logical English and Lexon as the efforts that lawyers might be comfortable with, as both employ controlled English, i.e., a reduced form of natural language.
Asst. Prof. Carla Reyes, Chair of the Texas Work Group on Blockchain Matters as well as Research Director for the Uniform Law Commission’s Technology Committee, proposes how to rewrite a part of the US Uniform Commercial Code (UCC) in Lexon, i.e., how to articulate US model trade law in Lexon. The Lexon code would be executed by the agencies that record claims on collateral for loans, to prevent creditors from lending twice against the same collateral.
The Lexon approach is independent of any specific natural language, and the Lexon grammar compiler exists to allow a multitude of natural languages to be implemented in a way analogous to how the English grammar has been realized. But small changes to the compiler itself will, as a rule, be necessary to implement a new language, except where it is a dialect of, or very close to, one that is already implemented. It would be possible to write different grammars for British, Indian and American English without changes to the core compiler code. But the compiler will have to be extended, for instance, to process a grammar for a natural language that features gender suffixes. The realistic expectation is that the compiler will be updated for every new language group that is added, e.g. Scandinavian, Romance, etc.
Lexon has been tested for English, German and Japanese. The indication is that it will work for most languages, with English being one of the least challenging cases; Chinese, Japanese, German, Spanish and Italian can be expected to be moderately challenging to implement, and the Eastern European languages, for their richness of cases and numbers, the hardest. Even the English grammar of Lexon is, throughout, a prescription to leave information (about case and number) out in a controlled manner. Other languages add gender, more cases, numbers and tenses that have to be grouped into sets of equivalent meaning.
Contributors to this work are welcome.
The Lexon compiler could target most 3rd generation programming languages.
Lexon allows for all participants of a DAO to read the smart contract code that actually works on-chain, and for this code itself to become the legally binding agreement, the bylaws or operating agreement, if the DAO is incorporated.
The increase in transparency and inclusivity from using human-readable code is obvious, though the extent of the improvements this allows for workflows and the democratic process might only show over time. Fundamentally, Lexon abolishes the privilege of programmers as the only ones with full insight into the implementation level of the core set of DAO rules. The concrete benefits are more eyes on the actual code for verification and error prevention, and a much faster and larger feedback cycle for double-checking the smart contracts against the underlying intentions of the entire group. This advantage holds, of course, whether the DAO is incorporated or not. It is arguably more important for a DAO that is not incorporated, as its members will often be personally liable for each of the DAO's actions and want to fully understand what the rules are.
Some DAOs will consider incorporation to shield their members from legal liability and to officially hold assets, thanks to being a legal person with legal standing, recognized by the jurisdiction they incorporate in. Liability of members of a DAO can be overlooked in the early stages of a project, but it usually means that all members are liable for every action of the DAO with the entirety of their personal assets. Incorporating the DAO will often reduce this liability to the assets invested in the DAO as the maximum loss that can be incurred by legal action against the DAO. If there is no investment, it is zero, even though the member might have voting rights to influence the DAO's actions. An incorporated DAO can also legally own the assets it controls, with consequences for taxation that can be relevant in certain jurisdictions. The difference Lexon can make for these scenarios is that the smart contract code can itself be the bylaws or operating agreement that governs the DAO company after incorporation.
The Oversimplicated DAO is an example of a DAO written in Lexon that makes full use of the maximum Ethereum contract size.
The Lexon compiler may be an AI tool in the original sense. It helps solve a problem that the first generation of AI research, as well as the currently mainstream analytical philosophy, found to be an insurmountable challenge: the seeming intractability of natural language. Importantly, it does so in a way that is scalable. Lexon addresses this problem first and proceeds from there, using techniques that were used in, and in part created for, AI. Contrary to appearance, Lexon is fundamentally not word-centric: it does not operate on the preconceived meaning of words but on the way they are used, as the late Wittgenstein demanded. This leaves behind axioms of analytical philosophy that informed early AI efforts.
The definition of AI has changed materially over the past decades, for marketing reasons, and what carries the label today might have been called statistics by the first generation of AI researchers, who were focused on creating general artificial intelligence. In the long run, Lexon might be a key to achieving general AI in combination with machine learning heuristics. In itself it is the exact opposite of machine learning: Lexon texts are transparent, self-explanatory and provide full agency. They are performed deterministically rather than with a chance of error. This addresses exactly the problematic shortfalls of machine learning that receive more attention today. That is, Lexon provides what is missing in the current, scaled-back notion of AI.
The potential for change that Lexon unlocks is arguably greater than what machine learning adds to the various industries that make use of it. The advances in governance, trade and law that its new quality of robustness, speed and accessibility provides will lead to the merging of some professions, the empowerment of some and the elimination of others, while increasing productivity and democratic participation in a more direct manner than the change driven by machine learning today.