

This is a quick overview of the language features that are currently implemented by the examples.

All of the features and examples presented here should work with the latest Nelua version. For features not yet implemented see the draft.

A note for Lua users

Most of Nelua’s syntax and semantics are similar to Lua, thus, if you know Lua, you probably know Nelua as well. However, Nelua has many additions, such as code with type notations, to make code more efficient and to allow metaprogramming. This overview will try to focus on those features.

There is no interpreter or VM; all of the code is compiled directly into native machine code, so expect better efficiency than Lua. However, this means that Nelua cannot load code generated at runtime; instead, the user is encouraged to generate code at compile-time using the preprocessor.

Although copying Lua syntax and semantics with minor changes is a goal of Nelua, not all Lua features are implemented yet. Most of the dynamic parts, such as tables and handling dynamic types at runtime, are not implemented yet, so at the moment using records instead of tables and adding type notations in function definitions is recommended. Visit this page for the full list of available features.

A note for C users

Nelua tries to expose most of C’s features without overhead, so expect to get near C performance when coding in the C style; that is, using type notations, manual memory management, pointers and records (structs).

The semantics are not exactly the same as C semantics, but they are close. There are slight differences to minimize undefined behaviors (like initializing to zero by default) and others to maintain consistency with Lua semantics (like integer division rounding towards negative infinity). However, there are ways to use C semantics when needed.

The preprocessor is much more powerful than C’s preprocessor, because it is actually the compiler running in Lua, so you can interact with the compiler during parsing. The preprocessor should be used for making generic code, code specialization, and avoiding code duplication.

Nelua compiles everything into a single readable C file. If you know C, it is recommended that you read the generated C code to learn more about what exactly the compiler outputs.

Hello world

A simple hello world program is just the same as in Lua:

print 'Hello world!'


Comments are just like in Lua:

-- one line comment
--[[
  multi-line comment
]]
--[=[
  multi-line comment, `=` can be placed multiple times
  in case you have `[[` `]]` tokens inside; it will
  always match its corresponding token
]=]


Variables are declared or defined like in Lua, but you may optionally specify a type when declaring:

local a = nil -- of deduced type 'any', initialized to nil
local b = false -- of deduced type 'boolean', initialized to false
local s = 'test' -- of deduced type 'string', initialized to 'test'
local one = 1 -- of deduced type 'integer', initialized to 1
local pi: number = 3.14 -- of type 'number', initialized to 3.14
print(a,b,s,one,pi) -- outputs: nil false test 1 3.1400000

The compiler takes advantage of types for compile-time and runtime checks and to generate efficient code for the specific type used.

Type deduction

When a variable has no specified type on its declaration, the type is automatically deduced and resolved at compile-time:

local a -- type will be deduced at scope end
a = 1
a = 2
print(a) -- outputs: 2
-- end of scope, compiler deduced 'a' to be of type 'integer'

The compiler does the best it can to deduce the type for you. In most situations it should work, but in some corner cases you may want to explicitly set a type for a variable.

Type collision

In the case of different types being assigned to the same variable, the compiler deduces the variable type to be the any type, a type that can hold anything at runtime. This makes Nelua code compatible with Lua semantics:

local a -- a type will be deduced
a = 2
a = false
print(a) -- outputs: false
-- a is deduced to be of type 'any', because it could hold an 'integer' or a 'boolean'

The any type is poorly supported at the moment, so please avoid this situation for now; that is, avoid making the compiler deduce type collisions that would result in an any type. Usually you don't want to use any types anyway, as they are less efficient. Collisions between different numeric types are fine, as the compiler always resolves to the largest appropriate numeric type.
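As a small sketch of the numeric case (variable names here are illustrative), a collision between numeric types resolves to a larger numeric type instead of falling back to any:

```nelua
local x = 1 -- deduced to 'integer' so far
x = 2.5 -- also assigned a 'number'
-- 'x' resolves to the type 'number', not 'any'
print(x)
```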

Zero initialization

Variables that are declared but not immediately assigned are always zero initialized by default. This prevents undefined behavior:

local a -- variable of deduced type 'any', initialized to 'nil'
local i: integer -- variable of type 'integer', initialized to 0
print(a, i) --outputs: nil 0

Zero initialization can be optionally disabled using the <noinit> annotation. Although not advised, one could do this for micro optimization purposes.
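For illustration, a minimal sketch of the <noinit> annotation:

```nelua
local x: integer <noinit> -- 'x' is not zero initialized here
x = 1 -- assign before any use, since its initial value is undefined
print(x) -- outputs: 1
```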

Auto variables

A variable declared as auto has its type deduced early based only on the type of its first assignment:

local a: auto = 1 -- a is deduced to be of type 'integer'

-- uncommenting the following will trigger the compile error:
--   error: in variable assignment: no viable type conversion from `boolean` to `int64`
--a = false

print(a) -- outputs: 1

Auto variables were not intended to be used in variable declarations like in the example above, because in most cases you can omit the type and the compiler will automatically deduce it. This can be used, however, if you want the compiler to deduce early. The auto type was mainly created to be used with polymorphic functions.

Compile-time variables

Compile-time variables have their values known at compile-time:

local a <comptime> = 1 + 2 -- constant variable of value '3' evaluated and known at compile-time

The compiler takes advantage of compile-time variables to generate efficient code, because compile-time variables can be processed at compile-time. Compile-time variables are also useful as compile-time parameters in polymorphic functions.

Const variables

Const variables are assigned once at declaration; the value may be computed at runtime, but the variable cannot mutate afterwards:

local x <const> = 1
local a <const> = x
print(a) -- outputs: 1

-- uncommenting the following will trigger the compile error:
--   error: cannot assign a constant variable
--a = 2

The const annotation can also be used for function arguments.

The use of the <const> annotation is mostly for aesthetic purposes. Its usage does not affect efficiency.
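A small sketch of <const> on a function argument (the function name here is illustrative):

```nelua
local function double(x: integer <const>): integer
  -- uncommenting the following would trigger a compile error,
  -- because 'x' cannot be assigned:
  --x = x * 2
  return x * 2
end
print(double(2)) -- outputs: 4
```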


Symbols are named identifiers for functions, types, and variables.

Local symbol

Local symbols are only visible in the current and inner scopes:

do
  local a = 1
  do
    print(a) -- outputs: 1
  end
end
-- uncommenting this would trigger a compiler error because `a` is not visible:
-- a = 1

Global symbol

Global symbols are visible in other source files. They can only be declared in the top scope:

global global_a = 1
global function global_f()
  return 'f'
end

If the above is saved into a file named globals.nelua in the same directory, then we can run:

require 'globals'
print(global_a) -- outputs: 1
print(global_f()) -- outputs: f

Unlike Lua, to declare a global variable you must explicitly use the global keyword.

Symbols with special characters

A symbol identifier, that is, the symbol name, can contain UTF-8 special characters:

local π = 3.14
print(π) -- outputs 3.14

Control flow

Nelua provides the same control flow mechanisms as Lua, plus some additional ones to make low level programming easier, like switch, defer, and continue statements.


If statements work just like in Lua:

local a = 1 -- change this to 2 or 3 to trigger other ifs
if a == 1 then
  print 'is one'
elseif a == 2 then
  print 'is two'
else
  print('not one or two')
end


The switch statement is similar to C:

local a = 1 -- change this to 2 or 3 to trigger other cases
switch a do
case 1 then
  print 'is 1'
case 2, 3 then
  print 'is 2 or 3'
else
  print 'else'
end

The case expression can only contain integral numbers known at compile-time. The compiler can generate more optimized code when using a switch instead of using many if statements for integers.


Do blocks are useful for creating arbitrary scopes to avoid collision of variable names:

do
  local a = 0
  print(a) -- outputs: 0
end
do
  local a = 1 -- can declare a variable named 'a' again
  print(a) -- outputs: 1
end


The defer statement is useful for executing code upon scope termination.

do
  defer
    print 'world'
  end
  print 'hello'
end
-- outputs 'hello' then 'world'

Defer is meant to be used for releasing resources in a deterministic manner on scope termination. The syntax and functionality is inspired by the similar statement in the Go language. It is guaranteed to be executed in reverse order before any return, break or continue statement.
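The reverse-order guarantee described above can be sketched as:

```nelua
do
  defer
    print 'first deferred'
  end
  defer
    print 'second deferred'
  end
  print 'body'
end
-- outputs 'body', then 'second deferred', then 'first deferred'
```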


Gotos are useful to get out of nested loops and jump between lines:

local haserr = true
if haserr then
  goto getout -- jump to the 'getout' label
end
print 'success'
::getout::
print 'fail'
-- outputs only 'fail'


While is just like in Lua:

local a = 1
while a <= 5 do
  print(a) -- outputs 1 2 3 4 5
  a = a + 1
end


Repeat also functions as in Lua:

local a = 0
repeat
  a = a + 1
  print(a) -- outputs 1 2 3 4 5
  local stop = a == 5
until stop

Note that, like Lua, a variable declared inside a repeat scope is visible inside its condition expression.

Numeric For

Numeric for is like in Lua, meaning it is inclusive of the first and the last elements:

for i = 0,5 do
  -- i is deduced to be of type 'integer'
  print(i) -- outputs 0 1 2 3 4 5
end

Like in Lua, numeric for loops evaluate the begin, end, and step expressions just once. The iteration variable's type is automatically deduced from the begin and end expressions only.
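A sketch demonstrating the single evaluation of the loop limit (the helper function is illustrative):

```nelua
local function limit(): integer
  print 'limit evaluated'
  return 3
end
for i=1,limit() do -- 'limit' is called only once, not on every iteration
  print(i)
end
-- outputs 'limit evaluated' once, then 1 2 3
```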

Exclusive For

The exclusive for creates loops that exclude the end value. It works with the comparison operators ~= <= >= < >:

for i=0,<5 do
  print(i) -- outputs 0 1 2 3 4
end

Stepped For

The last parameter in the for syntax is the step; the counter is incremented with i = i + step on each iteration. By default the step is 1. When using negative steps, a reverse for loop is possible:

for i=5,0,-1 do
  print(i) -- outputs 5 4 3 2 1
end


The continue statement is used to skip to the next iteration of a loop:

for i=1,10 do
  if i<=5 then
    continue
  end
  print(i) -- outputs: 6 7 8 9 10
end


The break statement is used to immediately exit a loop:

for i=1,10 do
  if i>5 then
    break
  end
  print(i) -- outputs: 1 2 3 4 5
end

Primitive types

Primitive types are the basic types built into the compiler.


local a: boolean -- variable of type 'boolean' initialized to 'false'
local b = false
local c = true
print(a,b,c) -- outputs: false false true

The boolean is defined as a bool in the generated C code.


Number literals are defined like in Lua:

local a = 1234 -- variable of type 'integer'
local b = 0xff -- variable of type 'integer'
local c = 3.14159 -- variable of type 'number'
local d = 'A'_uint8 -- variable of type 'uint8' set from an ASCII character
local e: integer
print(a,b,c,d,e) -- outputs: 1234 255 3.141590 65 0

The integer is the default type for integral literals without suffix. The number is the default type for fractional literals without suffix.

You can use type suffixes to force a type for a numeric literal:

local a = 1234_u32 -- variable of type 'uint32'
local b = 1_f32 -- variable of type 'float32'
print(a,b) --outputs: 1234 1.000000

The following table shows Nelua primitive numeric types and their related types in C:

Type C Type Suffixes
integer int64_t _i _integer
uinteger uint64_t _u _uinteger
number double _n _number
byte uint8_t _b _byte
isize intptr_t _is _isize
int8 int8_t _i8 _int8
int16 int16_t _i16 _int16
int32 int32_t _i32 _int32
int64 int64_t _i64 _int64
int128* __int128 _i128 _int128
usize uintptr_t _us _usize
uint8 uint8_t _u8 _uint8
uint16 uint16_t _u16 _uint16
uint32 uint32_t _u32 _uint32
uint64 uint64_t _u64 _uint64
uint128* unsigned __int128 _u128 _uint128
float32 float _f32 _float32
float64 double _f64 _float64
float128* _Float128 _f128 _float128

* Only supported by some C compilers and architectures.

The types isize and usize are usually 32 bits wide on 32-bit systems, and 64 bits wide on 64-bit systems.

When you need an integer value you should use integer unless you have a specific reason to use a sized or unsigned integer type. The integer, uinteger and number are intended to be configurable. By default they are 64 bits for all architectures, but this can be customized by the user at compile-time via the preprocessor when needed.


There are two string types: string, used for strings allocated at runtime, and stringview, used for string literals defined at compile-time as well as for views of runtime strings.

-- to use the 'string' type we must import from the standard library
require 'string'

local mystr: string -- empty string
local str1: string = 'my string' -- variable of type 'string'
local str2 = "static stringview" -- variable of type 'stringview'
local str3: stringview = 'stringview two' -- also a 'stringview'
print(str1, str2, str3) -- outputs: "my string" "static stringview" "stringview two"

The major difference between stringview and string is that stringview doesn’t manage the string memory, i.e. it doesn’t allocate or deallocate strings. The string type is usually allocated at runtime and it frees the string memory once its reference count reaches 0. When the garbage collector is disabled, the stringview uses weak references, thus any stringview pointing to a string is invalidated once the related string is freed. Both types can be converted from one to another.
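A minimal sketch of converting between the two types; that these particular conversions are accepted implicitly is an assumption here:

```nelua
require 'string'
local sv: stringview = 'hello'
local s: string = sv -- a 'string' may be created from a 'stringview'
local sv2: stringview = s -- and a 'stringview' can view a 'string'
print(s, sv2)
```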

Like in Lua, string is immutable. This makes the semantics similar to Lua. If the programmer wants a mutable string, implementing a custom string class is easily achievable.


An array is a list with a size that is fixed and known at compile-time:

local a: array(integer, 4) = {1,2,3,4}
print(a[0], a[1], a[2], a[3]) -- outputs: 1 2 3 4

local b: [4]integer -- "[4]integer" is syntax sugar for "array(integer, 4)"
print(b[0], b[1], b[2], b[3]) -- outputs: 0 0 0 0
local len = #b -- get the length of the array, should be 4
print(len) -- outputs: 4

When passing an array to a function as an argument, it is passed by value. This means the array is copied. This can incur some performance overhead. Thus when calling functions, you may want to pass arrays by reference using the reference operator when appropriate.
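As a sketch of passing an array by reference to avoid the copy (the function 'sum' is illustrative):

```nelua
local function sum(arr: *[4]integer): integer
  local s = 0
  for i=0,<4 do
    s = s + arr[i] -- the pointer to the array is dereferenced automatically
  end
  return s
end

local a: [4]integer = {1,2,3,4}
print(sum(&a)) -- outputs: 10
```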


Enums are used to list constant values in sequential order:

local Weeks = @enum {
  Sunday = 0,
  Monday,
  Tuesday,
  Wednesday,
  Thursday,
  Friday,
  Saturday
}
print(Weeks.Sunday) -- outputs: 0

local a: Weeks = Weeks.Monday
print(a) -- outputs: 1

The programmer must always initialize the first enum value. This choice was made to make the code clearer to read.


Any is a special type that can store any type at runtime:

local a: any = 2 -- variable of type 'any', holding type 'integer' at runtime
print(a) -- outputs 2
a = false -- now holds the type 'boolean' at runtime
print(a) -- outputs false

The any type makes Nelua semantics compatible with Lua. You can use it to write untyped code just like in Lua; however, know that you pay a price in performance, as operations on any types generate many branches at runtime, and thus less efficient code.

The any type is poorly supported at the moment, so please avoid using it for now. Usually you don’t want use any types anyway, as they require runtime branching, and are thus less efficient.


Records store variables in a block of memory:

local Person = @record {
  name: string,
  age: integer
}
-- typed initialization
local a: Person = {name = "Mark", age = 20}
print(a.name, a.age)

-- casting initialization
local b = (@Person){name = "Paul", age = 21}
print(b.name, b.age)

-- ordered fields initialization
local c = (@Person){"Eric", 21}
print(c.name, c.age)

-- late initialization
local d: Person
d.name = "John"
d.age  = 22
print(d.name, d.age)

Records are directly translated to C structs.


A pointer points to a region in memory of a specific type:

local n = nilptr -- a generic pointer, initialized to nilptr
local p: pointer -- a generic pointer to anything, initialized to nilptr
local i: pointer(integer) -- pointer to an integer

-- syntax sugar
local i: *integer

Pointers are directly translated to C raw pointers. Unlike C, pointer arithmetic is disallowed. To do pointer arithmetic you must explicitly cast to and from integers.
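A sketch of pointer arithmetic through integer casts (the variable names are illustrative):

```nelua
local a: [4]integer = {1,2,3,4}
local p: *integer = &a[0]
local addr: usize = (@usize)(p) -- cast the pointer to an integer
addr = addr + #@integer -- advance by the size of one integer
p = (@*integer)(addr) -- cast back to a pointer
print($p) -- outputs: 2
```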

Unbounded Array

An array with size 0 is an unbounded array, that is, an array with unknown size at compile time:

local a: array(integer, 4) = {1,2,3,4}

-- unbounded array only makes sense when used with pointer
local a_ptr: pointer(array(integer, 0))
a_ptr = &a -- takes the reference of 'a'

An unbounded array is useful for indexing pointers, because unlike C, you cannot index a pointer unless it is a pointer to an unbounded array.

Unbounded arrays are unsafe, because bounds checking is not possible at compile time or runtime. Use the span to have bounds checking.

Function type

The function type, mostly used to store callbacks, is a pointer to a function:

local function add_impl(x: integer, y: integer): integer
  return x + y
end

local function double_add_impl(x: integer, y: integer): integer
  return 2*(x + y)
end

local add: function(x: integer, y: integer): integer
add = add_impl
print(add(1,2)) -- outputs 3
add = double_add_impl
print(add(1,2)) -- outputs 6

The function type is just a pointer, and thus can be converted to/from generic pointers.


Spans, also known as “fat pointers” or “slices” in other languages, are pointers to a block of contiguous elements whose size is known at runtime:

require 'span'
local arr = (@[4]integer) {1,2,3,4}
local s: span(integer) = &arr
print(s[0], s[1]) -- outputs: 1 2
print(#s) -- outputs 4

The advantage of using a span instead of a pointer is that spans generate runtime checks for out of bounds access, so oftentimes code using span is safer. The runtime checks can be disabled in release builds.


The niltype is the type of nil.

The niltype is not useful by itself; it is only useful when used with unions to create an optional type, or for detecting nil arguments in polymorphic functions.

The “type” type

The type type is the type of a symbol that refers to a type. Symbols with this type are used at compile-time only. They are useful for aliasing types:

local MyInt: type = @integer -- a symbol of type 'type' holding the type 'integer'
local a: MyInt -- variable of type 'MyInt' (actually an 'integer')
print(a) -- outputs: 0

In the middle of statements the @ token is required to precede a type expression. This token signals to the compiler that a type expression comes after it.

Size of a type

You can use the operator # to get the size of any type in bytes:

local Vec2 = @record{x: integer, y: integer}
print(#Vec2) -- outputs: 8

Implicit type conversion

Some types can be implicitly converted. For example, any arithmetic type can be converted to any other arithmetic type:

local i: integer = 1
local u: uinteger = i
print(u) -- outputs: 1

Implicit conversion generates runtime checks for loss of precision in the conversion. If this happens the application crashes with a narrow casting error. The runtime checks can be disabled in release builds.

Explicit type conversion

The expression (@type)(variable) is used to explicitly convert a variable to another type.

local i = 1
local f = (@number)(i) -- convert 'i' to the type 'number'
print(i, f) -- outputs: 1 1.000000

If a type is aliased to a symbol then it is possible to convert variables by calling the symbol:

local MyNumber = @number
local i = 1
local f = MyNumber(i) -- convert 'i' to the type 'number'
print(i, f) -- outputs: 1 1.000000

Unlike implicit conversion, explicit conversions skip runtime checks:

local ni: integer = -1
-- the following would crash with "narrow casting from int64 to uint64 failed"
--local nu: uinteger = ni

local nu: uinteger = (@uinteger)(ni) -- explicit cast works, no checks are done
print(nu) -- outputs: 18446744073709551615


Unary and binary operators are provided for creating expressions:

print(2 ^ 2) -- pow, outputs: 4.000000
print(5 // 2) -- integer division, outputs: 2
print(5 / 2) -- float division, outputs: 2.500000

All Lua operators are provided:

Name Syntax Operation
or a or b conditional or
and a and b conditional and
lt a < b less than
gt a > b greater than
le a <= b less or equal than
ge a >= b greater or equal than
ne a ~= b not equal
eq a == b equal
bor a | b bitwise or
band a & b bitwise and
bxor a ~ b bitwise xor
shl a << b bitwise logical left shift
shr a >> b bitwise logical right shift
asr a >>> b bitwise arithmetic right shift
bnot ~a bitwise not
concat a .. b concatenation
add a + b arithmetic add
sub a - b arithmetic subtract
mul a * b arithmetic multiply
div a / b arithmetic division
idiv a // b arithmetic floor division
tdiv a /// b arithmetic truncate division
mod a % b arithmetic floor division remainder
tmod a %%% b arithmetic truncate division remainder
pow a ^ b arithmetic exponentiation
unm -a arithmetic negation
not not a boolean negation
len #a length
deref $a pointer dereference
ref &a memory reference

All the operators follow Lua semantics, i.e.:

  • / and ^ promote numbers to floats.
  • // and % round the quotient towards minus infinity.
  • /// and %%% round the quotient towards zero.
  • Integer overflows wrap around.
  • Bitwise shifts are defined for negative and large shifts.
  • and, or, not, ==, ~= can be used on any variable type.

The additional operators not found in Lua are >>>, ///, %%%, $ and &, used for low level programming.
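A quick sketch contrasting floor and truncate division and their remainders:

```nelua
print(-7 // 2) -- floor division, outputs: -4
print(-7 /// 2) -- truncate division, outputs: -3
print(-7 % 2) -- floor division remainder, outputs: 1
print(-7 %%% 2) -- truncate division remainder, outputs: -1
```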


Functions are declared as in Lua, but arguments and returns can have their types explicitly specified:

local function add(a: integer, b: integer): integer
  return a + b
end
print(add(1, 2)) -- outputs 3

Return type inference

The return type can be automatically deduced when not specified:

local function add(a: integer, b: integer)
  return a + b -- return is of deduced type 'integer'
end
print(add(1, 2)) -- outputs 3

Argument type inference

When not specifying a type for an argument, the compiler assumes that the argument is of the any type:

local function get(a)
  -- a is of type 'any'
  return a -- return is of deduced type 'any'
end
print(get(1)) -- outputs: 1

In contrast with variable declaration, when the type is omitted from a function argument there is no automatic deducing of the argument type. Instead it is assumed the argument must be of the any type. This makes Nelua semantics more compatible with Lua semantics.

Avoid doing this at the moment; explicitly set types for function arguments, since support for the any type is still poor. Omitting the return type is fine, because the compiler can deduce it.

Recursive calls

Functions can call themselves recursively:

local function fib(n: integer): integer
  if n < 2 then return n end
  return fib(n - 2) + fib(n - 1)
end
print(fib(10)) -- outputs: 55

Functions that make recursive calls must explicitly set the return type; the compiler cannot deduce it.

Multiple returns

Functions can have multiple return values as in Lua:

local function get_multiple()
  return false, 1
end

local a, b = get_multiple()
-- a is of type 'boolean' with value 'false'
-- b is of type 'integer' with value '1'
print(a,b) -- outputs: false 1

Multiple returns can optionally be explicitly typed:

local function get_multiple(): (boolean, integer)
  return false, 1
end

local a, b = get_multiple()
print(a,b) -- outputs: false 1

Multiple returns are efficient and packed into C structs in the code generator.

Top scope closures

Functions declared in the top scope work as top scope closures. They have access to all local variables declared beforehand:

local counter = 1 -- 'counter' lives in the heap because it's in the top scope
local function increment() -- 'increment' is a top scope closure
  -- counter is an upvalue for this function, we can access and modify it
  counter = counter + 1
end
print(counter) -- outputs 1
increment()
print(counter) -- outputs 2

Unlike Lua, when declaring functions in the top scope, the compiler takes advantage of the fact that top scope variables are always accessible in the program’s static storage memory to create lightweight closures without needing to hold an upvalue reference or to use a garbage collector. Therefore they are very lightweight and do not incur costs like a closure nested in a function would.

Polymorphic functions

Polymorphic functions, or poly functions for short in the sources, are functions with arguments whose properties can only be known when calling the function at compile-time. They are defined and processed later, when called for the first time. They are used to specialize the function for different argument types:

local function add(a: auto, b: auto)
  return a + b
end

local a = add(1,2)
-- call to 'add', a function 'add(a: integer, b: integer): integer' is defined
print(a) -- outputs: 3
local b = add(1.0, 2.0)
-- call to 'add' with different types, function 'add(a: number, b: number): number' is defined
print(b) -- outputs: 3.000000

In the above, the auto type is used as a generic placeholder to replace the function argument with the incoming call type. This makes it possible to create a generic function for multiple types.

Polymorphic functions are memoized, that is, only defined once for each kind of specialization.

Later we will show how polymorphic functions are more useful when used in combination with the preprocessor.

Record functions

A record type can have functions defined for it. This makes it possible to create functions that are to be used only within the record:

local Vec2 = @record{x: number, y: number}

function Vec2.create(x: integer, y: integer): Vec2
  return (@Vec2){x, y}
end

local v = Vec2.create(1,2)
print(v.x, v.y) -- outputs: 1 2

Record methods

A method is a function defined for a record that takes a reference to the record as its first argument. This first argument is visible as self inside the method. For defining or calling a method the colon token : must be used, just like in Lua.

local Rect = @record{x: number, y: number, w: number, h: number}

function Rect:translate(x: number, y: number)
  -- 'self' here is of the type '*Rect'
  self.x = self.x + x
  self.y = self.y + y
end

function Rect:area()
  -- 'self' here is of the type '*Rect'
  return self.w * self.h
end

local v = Rect{0,0,2,3}
v:translate(2,2)
print(v.x, v.y) -- outputs 2 2
print(v:area()) -- outputs 6

When calling methods on records, the compiler automatically takes care of referencing or dereferencing the object being called.

Record metamethods

Some special methods using the __ prefix are used by the compiler to define behaviors on certain operations with the record type. They are called metamethods and are similar to Lua metamethods:

require 'math'

local Vec2 = @record{x: number, y: number}

-- Called on the binary operator '+'
function Vec2.__add(a: Vec2, b: Vec2)
  return (@Vec2){a.x+b.x, a.y+b.y}
end

-- Called on the unary operator '#'
function Vec2:__len()
  return math.sqrt(self.x*self.x + self.y*self.y)
end

local a: Vec2 = {1, 2}
local b: Vec2 = {3, 4}
local c = a + b -- calls the __add metamethod
print(c.x, c.y) -- outputs: 4 6
local len = #c -- calls the __len metamethod
print(len) -- outputs: 7.2

Complete list of metamethods that can be defined for records:

Name Syntax Kind Operation
__lt a < b binary less than
__le a <= b binary less or equal than
__eq a == b binary equal
__bor a | b binary bitwise or
__band a & b binary bitwise and
__bxor a ~ b binary bitwise xor
__shl a << b binary bitwise logical left shift
__shr a >> b binary bitwise logical right shift
__asr a >>> b binary bitwise arithmetic right shift
__bnot ~a unary bitwise not
__concat a .. b binary concatenation
__add a + b binary arithmetic add
__sub a - b binary arithmetic subtract
__mul a * b binary arithmetic multiply
__div a / b binary arithmetic division
__idiv a // b binary arithmetic floor division
__tdiv a /// b binary arithmetic truncate division
__mod a % b binary arithmetic floor division remainder
__tmod a %%% b binary arithmetic truncate division remainder
__pow a ^ b binary arithmetic exponentiation
__unm -a unary arithmetic negation
__len #a unary length
__index a[b] indexing array index
__atindex a[b] indexing array index via reference
__tocstring   cast implicit cast to cstring
__tostring   cast implicit cast to string
__tostringview   cast implicit cast to stringview
__convert   cast implicit cast from anything

Record globals

Sometimes it is useful to declare a global variable inside a record type, using the record as a “namespace”:

global Globals = @record{} -- record used just for name spacing
global Globals.AppName: stringview
Globals.AppName = "My App"
print(Globals.AppName) -- outputs: My App

Record globals can be used to encapsulate modules, like tables are used to make modules in Lua.

Calls with nested records

You can define and later initialize complex records structures in a Lua-like style:

local WindowConfig = @record{
  title: stringview,
  pos: record {
    x: integer,
    y: integer
  },
  size: record {
    x: integer,
    y: integer
  }
}

local function create_window(config: WindowConfig)
  print(config.title, config.pos.x, config.pos.y)
end

-- the compiler knows that the argument should be parsed as WindowConfig
-- notice that the 'size' field is not set, so it is initialized to zeros
create_window({title="hi", pos={x=1, y=2}})

Memory management

By default Nelua uses a garbage collector to allocate and deallocate memory on its own. However, it can be disabled with the pragma nogc via the command line using -P nogc or in the sources:

## pragmas.nogc = true -- tells the compiler that we don't want to use the GC
require 'string' -- the string class will be implemented without GC code
local str = tostring(1) -- tostring needs to allocate a new string
print(str) -- outputs: 1
## if pragmas.nogc then -- the GC is disabled, must manually deallocate memory
str:destroy() -- deallocates the string
## end
print(str) -- the string was destroyed and is now empty, outputs nothing

Notice that when disabling the garbage collector, the coding style differs from the usual Lua style, since you now need to think about every allocation and deallocation, including strings, otherwise your application will leak memory. Thus it is best to leave the GC enabled when rapid prototyping.

Disable the GC if you want to control the memory on your own for performance reasons, if you know how to deal with memory management and don’t mind the additional cognitive load when coding.

Allocating memory with the GC

Nelua provides many allocators to assist in managing memory. The most important ones are the allocators.general and allocators.gc.

When using the GC you should allocate using the gc_allocator:

require 'string'
require 'memory'
require 'allocators.gc'

local Person = @record{name: string, age: integer}
local p: *Person = gc_allocator:new(@Person)
p.name = "John"
p.age = 20
print(p.name, p.age)
p = nilptr
-- we don't need to deallocate, the GC will do this on its own when needed!

When the GC is enabled, you must always allocate memory that contains pointers using gc_allocator instead of other allocators, because it marks the allocated memory region for scanning for references.

Allocating memory manually

For doing manual memory management, you can use the general purpose allocator, which is based on the system’s malloc and free functions:

## pragmas.nogc = true -- disables the GC
require 'string'
require 'memory'
require 'allocators.general'

local Person = @record{name: string, age: integer}
local p: *Person = general_allocator:new(@Person) -- allocate the appropriate size for Person
p.name = tostring("John") -- another allocation here
p.age = 20
print(p.name, p.age)
p.name:destroy() -- free the string allocation
general_allocator:delete(p) -- free the Person allocation
p = nilptr

Dereferencing and referencing

The operator & is used to get a reference to a variable, and the operator $ is used to access (dereference) the reference:

local a = 1
local ap = &a -- ap is a pointer to a
$ap = 2
print(a) -- outputs 2
a = 3
print($ap) -- outputs 3

Automatic referencing and dereferencing

The compiler can perform automatic referencing or dereferencing for records and arrays:

local Person = @record{name: stringview, age: integer}
local p: Person = {"John", 20}
local p_ref: *Person = p -- the referencing with `&` is implicit here
local p_copy: Person = p_ref -- the dereferencing with `$` is implicit here

That is, the above code is equivalent to:

local Person = @record{name: stringview, age: integer}
local p: Person = {"John", 20}
local p_ref: *Person = &p
local p_copy: Person = $p_ref

The above example is not very useful by itself, but it permits auto referencing when doing method calls:

local Person = @record{name: stringview, age: integer}

-- note that this function only accepts pointers
function Person.print_info(self: *Person)
  print(self.name, self.age)
end

local p: Person = {"John", 20}
p:print_info() -- perform auto referencing of 'p' when calling here
Person.print_info(p) -- equivalent, also performs auto referencing

The automatic referencing and dereferencing mechanism allows the use of any unary operator, binary operator, function call, or method call by value or by reference, for records or arrays.
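As a minimal sketch of that rule (assuming, per the statement above, that automatic dereferencing also applies when a binary operator metamethod is invoked through a pointer):

```nelua
local Vec = @record{x: integer, y: integer}

function Vec.__add(a: Vec, b: Vec): Vec
  return (@Vec){a.x + b.x, a.y + b.y}
end

local v: Vec = {1, 2}
local pv: *Vec = &v
-- 'pv' is automatically dereferenced for the binary operator
local sum = pv + v
print(sum.x, sum.y) -- outputs: 2 4
```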

Metaprogramming

The language offers advanced features for metaprogramming by having a full Lua preprocessor at compile time that can generate and manipulate code when compiling.


Preprocessor

At compile time, a Lua preprocessor is available to render arbitrary code. It works similarly to templates in web development, because it emits code written between its statements.

Lines beginning with ## and code between ##[[ ]] are Lua code evaluated by the preprocessor:

local a = 0
## for i = 1,4 do
  a = a + 1 -- unroll this line 4 times
## end
print(a) -- outputs 4

##[[
local something = false
if something then
]]
  print('hello') -- prints hello when compiling with "something" defined
##[[ end ]]

For instance, the above code compiles exactly as:

local a = 0
a = a + 1
a = a + 1
a = a + 1
a = a + 1

Using the Lua preprocessor, you can generate complex code at compile-time.

Emitting AST nodes (statements)

It is possible to manually emit AST nodes for statements while preprocessing:

##[[
-- create a macro that injects a custom node when called
local function print_macro(str)
  local node = aster.Call{{aster.String{str}}, aster.Id{"print"}}
  -- inject the node where this macro is being called from
  inject_astnode(node)
end
]]
## print_macro('hello')

The above code compiles exactly as:

print 'hello'

For a complete list of AST shapes that can be created using the aster module, read the AST definitions file, or see the syntax definitions spec file for examples.

Emitting AST nodes (expressions)

It is possible to manually emit AST nodes for expressions while preprocessing:

local a = #[aster.Number{'dec','1'}]#
print(a) -- outputs: 1

The above code compiles exactly as:

local a = 1
print(a) -- outputs: 1

Expression replacement

For placing values generated by the preprocessor you can use #[ ]#:

local deg2rad = #[math.pi/180.0]#
local hello = #['hello' .. 'world']#
local mybool = #[false]#
print(deg2rad, hello, mybool) -- outputs: 0.017453 helloworld false

The above code compiles exactly as:

local deg2rad = 0.017453292519943
local hello = 'helloworld'
local mybool = false
print(deg2rad, hello, mybool)

Name replacement

For placing identifier names generated by the preprocessor you can use #| |#:

local #|'my' .. 'var'|# = 1
print(myvar) -- outputs: 1

local function foo1() print 'foo' end
#|'foo' .. 1|#() -- outputs: foo

local Weekends = @enum { Friday=0, Saturday, Sunday }

The above code compiles exactly as:

local myvar = 1
print(myvar)

local function foo1() print 'foo' end
foo1()

local Weekends = @enum { Friday=0, Saturday, Sunday }

Preprocessor templated macros

A macro can be created by declaring a function in the preprocessor whose body contains normal code:

## function increment(a, amount)
  -- 'a' in the preprocessor context is a symbol, we need to use its name
  -- 'amount' in the preprocessor context is a lua number
  #|a.name|# = #|a.name|# + #[amount]#
## end
local x = 0
## increment(x, 4)

The above code compiles exactly as:

local x = 0
x = x + 4

Preprocessor macros emitting AST nodes

Creating macros using the template rendering mechanism in the previous example is handy, but has limitations and is not flexible enough for all cases. For example, suppose you want to create an arbitrarily sized array. In this case you will need to manually emit AST nodes:

##[[
-- Create a fixed array initializing to 1,2,3,4...n
local function create_sequence(attr_or_type, n)
  local type
  if traits.is_type(attr_or_type) then -- already a type
    type = attr_or_type
  elseif traits.is_attr(attr_or_type) then -- get a type from a symbol
    type = attr_or_type.value
  end
  -- check if the inputs are valid, in case of wrong input
  static_assert(traits.is_type(type), 'expected a type or a symbol to a type')
  static_assert(traits.is_number(n) and n > 0, 'expected n > 0')
  -- create the list of expressions
  local exprs = {}
  for i=1,n do
    -- aster.value converts any Lua value to the proper ASTNode
    exprs[i] = aster.value(i)
  end
  -- create the Table ASTNode, it's used for any braces {} expression
  return aster.Table{exprs, pattr = {
    -- hint to the compiler what type these braces should be evaluated as
    desiredtype = types.ArrayType(type, #exprs)}
  }
end
]]

local a = #[create_sequence(integer, 10)]#

The above code compiles exactly as:

local a = (@[10]integer){1,2,3,4,5,6,7,8,9,10}

Code blocks as arguments to preprocessor functions

Blocks of code can be passed to macros by surrounding them inside a function:

##[[
-- a preprocessor function that receives a code block and unrolls it
local function unroll(count, block)
  for i=1,count do
    block()
  end
end
]]

local counter = 1
## unroll(4, function()
  print(counter) -- outputs: 1 2 3 4
  counter = counter + 1
## end)

The above code compiles exactly as:

local counter = 1
print(counter)
counter = counter + 1
print(counter)
counter = counter + 1
print(counter)
counter = counter + 1
print(counter)
counter = counter + 1

Generic code via the preprocessor

Using macros it is possible to create generic code:

## function Point(PointT, T)
  local #|PointT|# = @record { x: #|T|#, y: #|T|# }
  function #|PointT|#:squaredlength()
    return self.x*self.x + self.y*self.y
  end
## end

## Point('PointFloat', 'float64')
## Point('PointInt', 'int64')

local pa: PointFloat = {x=1,y=2}
print(pa:squaredlength()) -- outputs: 5.000000

local pb: PointInt = {x=1,y=2}
print(pb:squaredlength()) -- outputs: 5

Preprocessing on the fly

While the compiler is processing, you can inspect what it already knows and use that to generate arbitrary code:

local Weekends = @enum { Friday=0, Saturday, Sunday }
## for i,field in ipairs(Weekends.value.fields) do
  print(#[field.name .. ' ' .. tostring(field.value)]#)
## end

The above code compiles exactly as:

local Weekends = @enum { Friday=0, Saturday, Sunday }
print 'Friday 0'
print 'Saturday 1'
print 'Sunday 2'

You can even manipulate what has already been processed:

local Person = @record{name: string}
## Person.value:add_field('age', primtypes.integer) -- add field 'age' to 'Person'
local p: Person = {name='Joe', age=21}
print(p.age) -- outputs '21'

The above code compiles exactly as:

local Person = @record{name: string, age: integer}
local p: Person = {name='Joe', age=21}
print(p.age) -- outputs '21'

The compiler is implemented and runs using Lua, and the preprocessor is actually a Lua function that the compiler is running, so it is even possible to modify or inject code into the compiler itself on the fly.

Preprocessing polymorphic functions

Polymorphic functions can be specialized at compile time when used in combination with the preprocessor:

local function pow(x: auto, n: integer)
  ## static_assert(x.type.is_arithmetic, 'cannot pow variable of type "%s"', x.type)
  ## if x.type.is_integral then
    -- x is an integral type (any unsigned/signed integer)
    local r: #[x.type]# = 1
    for i=1,n do
      r = r * x
    end
    return r
  ## elseif x.type.is_float then
    -- x is a floating point type
    return x ^ n
  ## end
end

local a = pow(2, 2) -- use specialized implementation for integers
local b = pow(2.0, 2) -- use pow implementation for floats
print(a,b) -- outputs: 4 4.000000

-- uncommenting the following will trigger the compile error:
--   error: cannot pow variable of type "string"
--pow('a', 2)

Preprocessor code blocks

Arbitrary Lua code can be put inside preprocessor code blocks. Their syntax starts with ##[[ or ##[=[ (any number of = tokens you want between the brackets) and ends with ]] or ]=] (matching the number of = tokens previously used):

##[[
-- this is a preprocessor code block
function my_compiletime_function(str)
  print(str) -- print at compile time
end
]]

-- call the function defined in the block above
## my_compiletime_function('hello from preprocessor')

As shown in the last line, functions defined inside preprocessor code blocks can later be called from any part of the code, at any point, using ##.

Although a block like this is defined in a single module, its declarations will be available to all modules required afterwards, because declarations default to the global scope in Lua. If you would like to avoid polluting other modules' preprocessor environments, declare such functions as local.
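For example, a sketch of keeping a helper private to one module's preprocessor (the helper name is hypothetical):

```nelua
##[[
-- declared local: not visible in other modules' preprocessor environments
local function log_compiling(name)
  print('compiling ' .. name)
end
]]
## log_compiling('this module') -- prints at compile time, only usable in this file
```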

Preprocessor function modularity

Suppose you want to use the same preprocessor function from multiple Nelua modules. As explained in the preprocessor code blocks section, one idea is to declare everything in that block as global so that everything would also be available in the preprocessor evaluation of other modules.

For example, in module_A.nelua:

##[[
-- this function is declared as global, so it'll be available on module_B.nelua
function foo()
  print "bar"
end
]]

Then, in module_B.nelua:

require 'module_A'
-- even though foo is not declared in this file, since it's global, it'll be available here
## foo()

Although this seems harmless, it can get messy if you define a function with the same name in different modules. It also means you’re relying on global scope semantics from the preprocessor, which might be unpredictable or brittle due to evaluation order.

Fortunately, there’s a more modular approach for code reuse which does not rely on global scope. Simply create a standalone Lua module and require it on all Nelua modules where you want to use it.

The previous example would be refactored as follows:

1. Create a foo.lua (or any name you want) file and paste your code there:

local function bar()
  print "bar"
end

return { bar = bar }

2. Then, in any source file that uses that module:

## local foo = require "foo"

## foo.bar()

Aside from modularity, this has the benefit of your preprocessor code being simply Lua code which can leverage all of your editor’s tooling and configuration, such as a code formatter, syntax highlighter, completions, etc.

If the Lua module is not in the same directory where the compiler is running from, then require will fail to find it. To solve this you can set your system’s LUA_PATH environment variable to a pattern which matches that directory, for example, executing export LUA_PATH="/myprojects/mymodules/?.lua" in your terminal (notice the ?.lua at the end).

Preprocessor utilities

The preprocessor comes with some pre-defined functions to assist metaprogramming.


static_error

Used to throw compile-time errors:

##[[
-- check the current Lua version in the preprocessor
if _VERSION ~= 'Lua 5.4' then
  static_error('not using Lua 5.4, got %s', _VERSION)
end
]]


static_assert

Used to throw compile-time assertions:

-- check the current Lua version in the preprocessor
## static_assert(_VERSION == 'Lua 5.4', 'not using Lua 5.4, got %s', _VERSION)


Generics

A generic is a special type created using a preprocessor function that is evaluated at compile time to generate a specialized type based on compile-time arguments. To do this, the generalize macro is used. It is hard to explain in words, so take a look at this full example:

-- Define a generic type for creating a specialized FixedStackArray
## local make_FixedStackArray = generalize(function(T, maxsize)
  -- alias compile-time parameters visible in the preprocessor to local symbols
  local T = #[T]#
  local MaxSize <comptime> = #[maxsize]#

  -- Define a record using the T and MaxSize compile-time parameters.
  local FixedStackArrayT = @record {
    data: [MaxSize]T,
    size: isize,
  }

  -- Push a value into the stack array.
  function FixedStackArrayT:push(v: T)
    if self.size >= MaxSize then error('stack overflow') end
    self.data[self.size] = v
    self.size = self.size + 1
  end

  -- Pop a value from the stack array.
  function FixedStackArrayT:pop(): T
    if self.size == 0 then error('stack underflow') end
    self.size = self.size - 1
    return self.data[self.size]
  end

  -- Return the length of the stack array.
  function FixedStackArrayT:__len(): isize
    return self.size
  end

  -- return the new defined type to the compiler
  ## return FixedStackArrayT
## end)

-- define FixedStackArray generic type in the scope
local FixedStackArray: type = #[make_FixedStackArray]#

do -- test with 'integer' type
  local v: FixedStackArray(integer, 3)

  -- push elements
  v:push(1)
  v:push(2)
  v:push(3)
  -- uncommenting would trigger a stack overflow error:
  -- v:push(4)

  -- check the stack array length
  assert(#v == 3)

  -- pop elements checking the values
  assert(v:pop() == 3)
  assert(v:pop() == 2)
  assert(v:pop() == 1)
  -- uncommenting would trigger a stack underflow error:
  -- v:pop()
end

do -- test with 'number' type
  local v: FixedStackArray(number, 3)

  -- push elements
  v:push(1.5)
  v:push(2.5)
  v:push(3.5)
  -- uncommenting would trigger a stack overflow error:
  -- v:push(4.5)

  -- check the stack array length
  assert(#v == 3)

  -- pop elements checking the values
  assert(v:pop() == 3.5)
  assert(v:pop() == 2.5)
  assert(v:pop() == 1.5)
  -- uncommenting would trigger a stack underflow error:
  -- v:pop()
end

Generics are powerful for specializing efficient code at compile time based on different compile-time arguments. They are used in many places in the standard library, for example, to create the vector, sequence, and span classes. Generics are similar to C++ templates.

Generics are memoized, that is, they are evaluated and defined just once for the same compile-time arguments.
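A sketch of what memoization implies, reusing the FixedStackArray generic from the example above (the equality check is illustrative, assuming memoized generics resolve to the very same type):

```nelua
local u: FixedStackArray(integer, 3)
local w: FixedStackArray(integer, 3)
-- both declarations resolved to the same specialized type, evaluated only once
## static_assert(u.type == w.type, 'expected the same memoized type')
```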


Concepts

Concepts are a powerful system used to specialize polymorphic functions efficiently at compile time.

An argument of a polymorphic function can use the special concept type defined by a preprocessor function that, when evaluated at compile time, decides whether the incoming variable type matches the concept requirements.

To create a concept, use the preprocessor function concept:

local an_arithmetic = #[concept(function(attr)
  -- the first argument of the concept function is an Attr,
  -- an attr stores different attributes of the incoming symbol, variable or node,
  -- we want to check whether the incoming attr type matches the concept
  if attr.type.is_arithmetic then
    -- the attr is an arithmetic type (can add, subtract, etc)
    return true
  end
  -- the attr type does not match this concept
  return false
end)]#

local function add(x: an_arithmetic, y: an_arithmetic)
  return x + y
end

print(add(1, 2)) -- outputs 3

-- uncommenting the following will trigger the compile error:
--   type 'boolean' could not match concept 'an_arithmetic'
-- add(1, true)

When the concepts of a function are matched for the first time, a specialized function is defined just for those incoming types, thus the compiler generates different functions in C code for each different match. This means that the code is specialized for each type and is handled efficiently because the code does not need to do runtime type branching (the type branching is only done at compile time).

The property type.is_arithmetic is used here to check the incoming type. All the properties defined by the compiler to check the incoming types can be seen here.

Specializing with concepts

A concept can match multiple types, thus it is possible to specialize a polymorphic function further using a concept:

require 'string'

local an_arithmetic_or_string = #[concept(function(attr)
  if attr.type.is_stringy then
    -- we accept strings
    return true
  elseif attr.type.is_arithmetic then
    -- we accept arithmetics
    return true
  end
  return false
end)]#

local function add(x: an_arithmetic_or_string,
                   y: an_arithmetic_or_string)
  ## if x.type.is_stringy and y.type.is_stringy then
    return x .. y
  ## else
    return x + y
  ## end
end

-- add will be specialized for arithmetic types
print(add(1, 2)) -- outputs 3
-- add will be specialized for string types
print(add('1', '2')) -- outputs 12

The compiler only defines new different specialized functions as needed, i.e. specialized functions for different argument types are memoized.

Specializing concepts for records

Sometimes you may want to check whether a record matches a concept. To do this, you can set a field on its type to check later in the concept; you can also use that field in the preprocessor to assist in specializing code:

local Vec2 = @record{x: number, y: number}
-- Vec2 is an attr of the "type" type, Vec2.value is the type it holds
-- here we set is_Vec2 at compile time, to use later for checking whether an attr is a Vec2
## Vec2.value.is_Vec2 = true

local Vec2_or_arithmetic_concept = #[concept(function(attr)
  -- match in case of an arithmetic or a Vec2
  return attr.type.is_arithmetic or attr.type.is_Vec2
end)]#

-- we use the concept in the metamethod __add to allow adding a Vec2 with numbers
function Vec2.__add(a: Vec2_or_arithmetic_concept, b: Vec2_or_arithmetic_concept)
  -- specialize the function at compile time based on the argument types
  ## if a.type.is_Vec2 and b.type.is_Vec2 then
    return (@Vec2){a.x + b.x, a.y + b.y}
  ## elseif a.type.is_Vec2 then
    return (@Vec2){a.x + b, a.y + b}
  ## elseif b.type.is_Vec2 then
    return (@Vec2){a + b.x, a + b.y}
  ## end
end

local a: Vec2 = {1, 2}
local v: Vec2
v = a + 1 -- Vec2 + arithmetic
print(v.x, v.y) -- outputs: 2 3
v = 1 + a -- arithmetic + Vec2
print(v.x, v.y) -- outputs: 2 3
v = a + a -- Vec2 + Vec2
print(v.x, v.y) -- outputs: 2 4

Concepts with logic

You can put logic in your concept to check any kind of properties that the incoming attr should satisfy, and return compile-time errors explaining why the concept didn't match:

-- Concept to check whether a type is indexable.
local indexable_concept = #[concept(function(attr)
  local type = attr.type
  if type.is_pointer then -- accept pointers to containers
    type = type.subtype
  end
  -- we accept arrays
  if type.is_array then
    return true
  end
  -- we expect a record
  if not type.is_record then
    return false, 'the container is not a record'
  end
  -- the record must have an __index metamethod
  if not type.metafields.__index then
    return false, 'the container must have the __index metamethod'
  end
  -- the record must have a __len metamethod
  if not type.metafields.__len then
    return false, 'the container must have the __len metamethod'
  end
  -- the concept matched all the imposed requirements
  return true
end)]#

-- Sum all elements of any container with index beginning at 0.
local function sum_container(container: indexable_concept)
  local v: integer = 0
  for i=0,<#container do
    v = v + container[i]
  end
  return v
end

-- We create our customized array type.
local MyArray = @record {data: [10]integer}

function MyArray:__index(i: integer)
  return self.data[i]
end

function MyArray:__len()
  return #self.data
end

local a: [10]integer = {1,2,3,4,5,6,7,8,9,10}
local b: MyArray = {data = a}

-- sum_container can be called with 'a' because it matches the concept
-- we pass as reference using & here to avoid an unnecessary copy
print(sum_container(&a)) -- outputs: 55

-- sum_container can also be called with 'b' because it matches the concept
-- we pass as reference using & here to avoid an unnecessary copy
print(sum_container(&b)) -- outputs: 55

Concept that infers to another type

Sometimes it is useful to make a concept infer to a type different from the incoming attr's. For example, suppose you want to specialize a function that optionally accepts any kind of arithmetic, but you really want it to be implemented as a number:

local facultative_number_concept = #[concept(function(attr)
  if attr.type.is_niltype then
    -- niltype is the type when the argument is missing or when we use 'nil',
    -- we accept it because the number is facultative
    return true
  end
  -- instead of returning true, we return the desired type to be implemented,
  -- the compiler will take care of implicitly casting the incoming attr to the desired type,
  -- or throw an error if that is not possible,
  -- here we want to force the function using this concept to implement it as a 'number'
  return primtypes.number
end)]#

local function get_number(x: facultative_number_concept)
  ## if x.type.is_niltype then
    return 0
  ## else
    return x
  ## end
end

print(get_number(nil)) -- prints 0
print(get_number(2)) -- prints 2

Facultative concept

Facultative concepts are commonly used, thus there is a shortcut for creating them. For instance, the previous code is equivalent to this:

local function get_number(x: facultative(number))
  ## if x.type.is_niltype then
    return 0
  ## else
    return x
  ## end
end

print(get_number(nil)) -- prints 0
print(get_number(2)) -- prints 2

Use this when you want to specialize optional arguments at compile-time without any runtime costs.

Overload concept

Using concepts to overload functions for different incoming types at compile time is a common use, so there is also a shortcut for creating overload concepts:

local function foo(x: overload(integer,stringview,niltype))
  ## if x.type.is_integral then
    print('got integer ', x)
  ## elseif x.type.is_stringview then
    print('got string ', x)
  ## else
    print('got nothing')
  ## end
end

foo(2) -- outputs: got integer 2
foo('hello') -- outputs: got string hello
foo(nil) -- outputs: got nothing

Use this when you want to specialize different argument types at compile time without runtime costs.


Annotations

Annotations are used to prompt the compiler to behave differently during code generation.

Function annotations

local function sum(a: integer, b: integer) <inline> -- C inline function
  return a + b
end
print(sum(1,2)) -- outputs: 3

Variable annotations

local a: integer <noinit> -- don't initialize the variable to zero
a = 0 -- manually initialize to zero
print(a) -- outputs: 0

local b <volatile> = 1 -- C volatile variable
print(b) -- outputs: 1

C interoperability

Nelua provides many utilities to interoperate with C code.

Importing C functions

To import a C function you must use the <cimport> annotation:

-- import "puts" from C library
local function puts(s: cstring <const>): cint <cimport>
  -- cannot have any code here, because this function is imported
end

puts('hello') -- outputs: hello

The above code generates exactly this C code:

int puts(const char* s);
static char __strlit1[6] = "hello";
int nelua_main() {
  puts(__strlit1);
  return 0;
}

Notice that the puts function is declared automatically, i.e., there is no need to include the header that declares the function.

Importing C functions declared in headers

Sometimes you need to import a C function that is declared in a C header, especially if it is declared as a macro:

-- `nodecl` is used because this function doesn't need to be declared by Nelua,
-- as it will be declared in <stdio.h> header
-- `cinclude` is used to make the compiler include the header when using the function
local function puts(s: cstring <const>): cint <cimport, nodecl, cinclude '<stdio.h>'> end

puts('hello') -- outputs: hello

The above code generates exactly this C code:

#include <stdio.h>
static char __strlit1[6] = "hello";
int nelua_main() {
  puts(__strlit1);
  return 0;
}

Notice that the nodecl annotation is needed when importing any C function that is declared in a C header, otherwise the function would have duplicate declarations.

Including C files with defines

Sometimes you need to include a C file while defining something before the include:

-- link SDL2 library
## linklib 'SDL2'
-- define SDL_MAIN_HANDLED before including SDL2
## cdefine 'SDL_MAIN_HANDLED'
-- include SDL2 header
## cinclude '<SDL2/SDL.h>'

-- import some constants defined in SDL2 header
local SDL_INIT_VIDEO: uint32 <cimport, nodecl>

-- import functions defined in SDL2 header
local function SDL_Init(flags: uint32): int32 <cimport, nodecl> end
local function SDL_Quit() <cimport, nodecl> end


Importing C functions using a different name

By default, the <cimport> annotation imports the function under its declared symbol name, but it is possible to import it under a different name:

-- we pass the C function name as a parameter for `cimport`
local function c_puts(s: cstring): cint <cimport 'puts', nodecl, cinclude '<stdio.h>'>

c_puts('hello') -- outputs: hello

Linking a C library

When importing a function from a C library, you also need to link the library. To do this, use the linklib function in the preprocessor:

-- link the SDL2 library when compiling
## linklib 'SDL2'

local function SDL_GetPlatform(): cstring <cimport> end

print(SDL_GetPlatform()) -- outputs your platform name (Linux, Windows, ...)

Notice that we didn’t need to include the SDL header in the above example; we could have, but instead we let Nelua declare the function.

Passing C flags

It is possible to add custom C flags when compiling via the preprocessor:

##[[
if FAST then -- release build
  cflags '-Ofast' -- C compiler flags
  ldflags '-s' -- linker flags
else -- debug build
  cflags '-Og'
end
]]

If we compile the above example with nelua -DFAST example.nelua, the C compiler will compile with -Ofast, otherwise with -Og.

Emitting raw C code

Sometimes to do low level things in C, or to avoid Nelua’s default semantics, you may want to emit raw C code:

local function do_stuff()
  -- emits in the declarations section of the generated C file
  ## cemitdecl '#include <stdio.h>'

  -- emits inside this function in the generated C file
  ##[==[ cemit([[
    const char *msg = "hello from C\n";
    printf("%s", msg);
  ]]) ]==]
end

do_stuff() -- outputs: hello from C


Nelua can emit C code in 3 different sections: the global declarations section using cemitdecl, the global definitions section using cemitdef, or the current scope using cemit. Usually, you want to use cemit.
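A small sketch of the three sections, emitting C comments (which are valid in any section):

```nelua
## cemitdecl '/* emitted once in the global declarations section */'
## cemitdef '/* emitted once in the global definitions section */'

local function annotated()
  ## cemit '/* emitted here, inside this function body */'
end
```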

Exporting named C functions

You can use Nelua to create C libraries. When doing this, you may want to fix the name of the generated C function and export it:

-- `cexport` marks this function to be exported
-- `codename` fixes the generated C function name
local function foo() <cexport, codename 'mylib_foo'>
  return 1
end

The above code generates exactly this C code:

extern int64_t mylib_foo();
int64_t mylib_foo() {
  return 1;
}

C primitives

For importing C functions, additional primitive types are provided for compatibility:

Type          C Type               Suffixes
cshort        short                _cshort
cint          int                  _cint
clong         long                 _clong
clonglong     long long            _clonglong
cptrdiff      ptrdiff_t            _cptrdiff
cchar         char                 _cchar
cschar        signed char          _cschar
cuchar        unsigned char        _cuchar
cushort       unsigned short       _cushort
cuint         unsigned int         _cuint
culong        unsigned long        _culong
culonglong    unsigned long long   _culonglong
csize         size_t               _csize
clongdouble   long double          _clongdouble
cstring       char*                _cstring

Use these types for importing C functions only. For normal code, use the other Nelua primitive types.
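For instance, a sketch importing C's labs (declared in <stdlib.h> as long labs(long)) with the C-compatible types above, following the cimport pattern shown earlier:

```nelua
-- import the C standard library's labs using C-compatible primitive types
local function labs(x: clong): clong <cimport, nodecl, cinclude '<stdlib.h>'> end

print(labs(-10)) -- outputs: 10
```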
