Categories: AI, NTai, Spring

Lua & AIs Solution 1: Extension Modules

There is an issue in the Spring AI scene right now regarding the onslaught of Lua game mechanics and the disconnect between those new rules and the AIs. AIs are unable to play the new game mechanics because they are unaware of their existence.

[Image: THIS v4 (work in progress), courtesy of KDR]

Until recently it was not possible for AIs to communicate with Lua gadgets, and now that the interfaces for Lua <-> AI communication have been added, there is no documentation on either the AI or the Lua end to explain how these callouts work or how they would be used. No examples, no explanations, no demos, no hints.

However, once that's been sorted out, there is still the issue of how the AI and Lua should co-ordinate. To this end I have several solutions, which I will outline in these blog posts.

First Solution: Extension Modules

"Extension modules" is a name I came up with just now to describe what I envisage as my first solution. To understand how I intend to implement it, we must first look at how NTai works.

In NTai, the vast majority of objects inherit in some way from a modular class, which is given a pointer to a main class and has messages passed to it. This acts as a basic framework for handling messaging and events during an object's lifetime. These objects may represent units, unit tasks, or global overseers of a particular task such as attack co-ordination.
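To make that concrete, here is a minimal sketch of the pattern. The names used (Global, IModule, Message, AttackCoordinator) are illustrative assumptions for this post, not the actual NTai classes:

    // Minimal sketch of NTai's modular-object pattern (names are illustrative).
    #include <string>

    class Global; // the main AI class; every module keeps a pointer back to it

    struct Message {
        std::string type;   // e.g. "UnitIdle", "EnemyEnterLOS"
        int unitID;         // the unit the event concerns, if any
    };

    class IModule {
    public:
        explicit IModule(Global* ai) : ai(ai) {}
        virtual ~IModule() {}
        virtual void Init() {}                             // called when the object comes alive
        virtual void ReceiveMessage(const Message& msg) {} // events are routed here during its lifetime
    protected:
        Global* ai; // access to the rest of the AI
    };

    // Units, unit tasks, and global overseers all specialise the same base:
    class AttackCoordinator : public IModule {
    public:
        AttackCoordinator(Global* ai) : IModule(ai) {}
        void ReceiveMessage(const Message& msg) {
            if (msg.type == "EnemyEnterLOS") {
                // decide whether to re-task attackers, etc.
            }
        }
    };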

How Does This Affect Lua <-> AI?

The intention here is that a class be made which inherits from this interface and is given several parameters. This class would then act as a proxy across the Lua <-> AI divide, passing messages to and from code written on the Lua end.
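As a rough sketch, reusing the IModule sketch above: since the real engine callouts are undocumented, SendToLua and OnLuaMessage below are stand-ins for whatever the Lua <-> AI interface actually provides. The proxy simply forwards module messages across the divide and relays anything that comes back.

    // Hypothetical proxy module; SendToLua/OnLuaMessage stand in for the real,
    // currently undocumented, AI <-> Lua callouts.
    class LuaTaskProxy : public IModule {
    public:
        LuaTaskProxy(Global* ai, const std::string& luaObjectName)
            : IModule(ai), luaObjectName(luaObjectName) {}

        void Init() {
            // ask the Lua side to create its half of the pairing
            SendToLua(luaObjectName, "init", "");
        }

        void ReceiveMessage(const Message& msg) {
            // forward every AI-side event to the matching Lua object
            SendToLua(luaObjectName, msg.type, std::to_string(msg.unitID));
        }

        // called when the Lua side sends something back, e.g. "task finished"
        void OnLuaMessage(const std::string& data) {
            // translate it into an AI-side action or mark the task complete
        }

    private:
        std::string luaObjectName; // which Lua-side object this proxy is paired with

        void SendToLua(const std::string& target, const std::string& type,
                       const std::string& payload) {
            // placeholder: marshal everything as strings across the interface
        }
    };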

How Would It Be Done?

A new Lua-based configuration system would be created, possibly written as a Lua gadget or requiring a built-in Lua interpreter. The configuration file would carry the details needed to request that particular Lua objects be invoked. Lua coders or configuration builders could then create brand new tasks/keywords for their task lists, or create additional objects that act on a global level rather than as a task, just as scouting and attacking work in NTai.
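For illustration only (the keyword names and registry here are hypothetical, and reuse the LuaTaskProxy sketch above): when the task-list parser meets a keyword it does not recognise as built in, it could look it up among the Lua-declared keywords and hand the work to a proxy.

    // Hypothetical registry: keywords declared by the Lua-side configuration
    // map to proxy modules instead of built-in task classes.
    #include <map>
    #include <memory>

    std::map<std::string, std::string> luaKeywords = {
        // task-list keyword        ->  Lua object that implements it
        {"b_lua_superweapon",  "SuperweaponManager"},
        {"b_lua_nuke_defence", "NukeDefence"},
    };

    std::unique_ptr<IModule> CreateTask(Global* ai, const std::string& keyword) {
        auto it = luaKeywords.find(keyword);
        if (it != luaKeywords.end()) {
            // unknown to the C++ side: let the paired Lua object drive it
            return std::unique_ptr<IModule>(new LuaTaskProxy(ai, it->second));
        }
        // ...otherwise fall through to the normal built-in task factory
        return nullptr;
    }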

This would allow gadgets to expose additional commands that configuration builders could add to NTai, or configuration builders could add their own custom logic. The AI could also be integrated further into gadgets, should the gadget authors want that.

Are There Any Other Requirements?

This would require a Lua-based object sitting on the other end of the divide, separating out the messages NTai hands it and passing them on to the individual Lua objects; most likely a gadget of some sort.

What Are The Downsides?

Due to insufficient documentation, not enough is known about the Lua <-> AI interface to build this.

There is also the issue that although these objects are generic, their context is not, and thus only the global objects could be supported by other AIs. The majority of the design assumes that the AI on the AI side of the divide is based on the NTai code base or implements atomic, task-based unit control.

Next Post: Solution 2

The next solution is something that I'm sure other AI developers have asked for but not completely thought through: completely generic and supportable by all AIs. The workload, however, is enormous, and it could serve as a research example that lasts decades.
