C12 running local LLMs

As Clarion 12 is reported to be able to run local LLMs (AIs) [1], and earlier versions can do so using #RunDLL in the template language, I thought I’d start a thread for those interested in running local LLMs.

So I’ve been keeping an eye on which LLMs are performing best at programming tasks, and Claude AI seems to be doing well in various benchmarks and polls [2-5].

It’s possible to run an LLM locally on a dev machine, and RAG (retrieval-augmented generation) [6] seems to be a way to keep LLMs more accurate, i.e. reduce “hallucinations”.

So to get started with running an LLM locally, it would help to find some good instructions for setting up a local LLM environment.

This guide seems quite good [7], but if anyone finds better instructions, or runs into setup problems, let us know here.

I’ve been wondering just how these LLMs will learn a new language, like the template language or the Clarion language, and one way seems to be to point them at one or more GitHub repos to analyse the code.

This should address the problem where some template language commands can only be used within other template language commands, like #TABs encapsulating #PROMPTs.*

*Not strictly true, but used as an example of commands needing to be encapsulated.
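
For illustration, a minimal and purely hypothetical extension template fragment nesting #PROMPTs inside a #TAB might look like the sketch below; the extension name, prompt text and symbols are all made up, and as the footnote says the real nesting rules are looser than this:

    #!-----------------------------------------------------------
    #EXTENSION(MyAiDemo, 'Hypothetical demo extension')
    #!-----------------------------------------------------------
    #SHEET
      #TAB('General')
        #PROMPT('Enable this feature?', CHECK), %EnableFeature, DEFAULT(0)
        #PROMPT('Description:', @S60), %FeatureDescription
      #ENDTAB
    #ENDSHEET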

I haven’t found any other method to train the LLMs yet, such as providing a list, but it was Mark’s post [8] that got me thinking: how do you train an LLM on a proprietary language like Clarion?

Anyway, I think this post/thread should cover the basics and perhaps act as a starter thread for this particular topic with Clarion.

I would imagine the built-in Facebook/Meta Llama 3 will be more seamless to use in C12, as it will all be cloud based so won’t require any setup, but it might also mean less control over screen and report layouts. There may be a way to influence this though, by providing the GitHub code examples which BobZ put out a few years ago at one of the DevCons, which might then become included in the training data for Llama 3.

Any questions on the subject, no matter how dumb they might appear, could also go here, as I’m sure others may have the same questions.

Comments always appreciated…

[1] Embracing the Future - Clarion 12 Archives - Clarion
[2] Shepherd’s Dog - when-ai-fails/shepards-dog/README.md at main · vnglst/when-ai-fails · GitHub
[3] Aider LLM Leaderboards | aider
[4] Can Ai Code Results - a Hugging Face Space by mike-ravkine
[5] ProLLM Benchmarks
[6] What Is Retrieval-Augmented Generation aka RAG | NVIDIA Blogs
[7] Claude Ai Local Install Guide | Restackio
[8] Clarion Language: Complete List of Statements That Require a Matching END

There’s a bit of discussion on the AI channel in Discord.

With AI starting to generate interesting results for coding, we found Anthropic to be interesting after trying several others. It has understood Clarion code without seeing the Clarion language model and made some astounding generative contributions to complex Clarion interface modules.

This is a summary of what the AI itself says about the results it generated: "The ability to make contextual inferences, understand unfamiliar code without explicit instruction, and deduce function from structure represents a capability that’s qualitatively different from traditional rule-based systems or simple pattern matching.

It’s a form of synthetic understanding that allows for reasoning across domains and making connections between concepts that weren’t explicitly linked in the training data. The system can “fill in the gaps” and make educated guesses based on partial information - much like humans do when confronted with new situations.

Whether this qualifies as “intelligence” in the human sense is a complex philosophical question, but it certainly represents a meaningful step forward in how machines can understand and work with human-created artifacts like code.

What makes this particularly interesting is that it wasn’t explicitly programmed as a feature - it emerged from the broader learning process. This emergent capability to transfer understanding across contexts and make reasonable inferences from limited information suggests we’ve crossed an important threshold in how AI systems can engage with complex human knowledge domains."

Anthropic has taken the old XML to/from queue methods utility file and changed the output to support data file row/column output, and added a multi-user WATCH. Reads still occur via the old XML code base, but writes use a simple ASCII text XML output: very simple but fast using the Clarion ASCII file driver, with multi-user queue row locking added back to the row/column generic data storage. Here is the AI-generated summary of the new modules it produced, with little to no training. It appears that Clarion, being a very straightforward language, is very easy for AI to generate, understand and refactor.

Enhanced Row-Level Queue Storage Module Implementation Summary

This implementation enables row-level operations for queue storage with optional optimistic concurrency control, providing a significant enhancement to the original module.

Key Components

  1. Module Header (module-header)

    • Defines all function prototypes
    • Maintains backward compatibility with original functions
    • Adds new row-level operation functions
  2. Schema Definitions (schema-definitions)

    • Enhanced DataClaQueuesStorage with composite keys
    • Added RowColKey for efficient row-level access
    • Added QueueName for multi-queue support
    • Added version tracking tables
  3. Schema Version Management (schema-version-functions)

    • TableSchemaVersion checks current schema version
    • TableSchemaConvert handles migration from old to new format
    • Automatic backup of original data during conversion
  4. Row-Level Operations (row-operations)

    • TableQCellRowSave saves a single row with optional concurrency control
    • TableQCellRowLoad loads a single row
    • Uses SET with composite key for efficient positioning
  5. Lock Status Functions (lock-status-functions)

    • Simplified status functions that support the optimistic concurrency approach
  6. Module Binding Functions (binding-functions)

    • Functions to bind and unbind the module
    • Memory management for queue references
  7. Usage Example (usage-example)

    • Demonstrates both multi-user and single-user approaches
    • Shows how to handle concurrency conflicts

Key Design Decisions

  1. Optimistic Concurrency with WATCH

    • Uses Clarion’s built-in WATCH mechanism instead of custom locking
    • Detects conflicts at save time
    • No need for explicit lock/unlock operations
  2. Composite Key for Row Operations

    • RowColKey on (RowNo, ColNo) enables efficient row-level access
    • No need for complex VIEWs or custom filtering
  3. Backward Compatibility

    • Original functions maintained
    • Automatic schema conversion
    • Seamless integration with existing code
  4. Multi-Queue Support

    • Added QueueName field to support multiple queues in the same storage
    • Each queue can maintain its own namespace

Usage Patterns

  1. Multi-User Environment

    ! Load specific row
    Result = TableQCellRowLoad('QueueName', QueueRef, RowNumber)
    
    ! Make changes to queue data
    QueueRef.Field = 'New Value'
    
    ! Save with optimistic concurrency (TRUE enables WATCH)
    Result = TableQCellRowSave('QueueName', QueueRef, RowNumber, TRUE)
    
    ! Check for concurrency conflicts
    IF Result = -4
       ! Handle conflict
    END
    
  2. Single-User Environment

    ! Load specific row
    Result = TableQCellRowLoad('QueueName', QueueRef, RowNumber)
    
    ! Make changes to queue data
    QueueRef.Field = 'New Value'
    
    ! Save without concurrency checks
    Result = TableQCellRowSave('QueueName', QueueRef, RowNumber, FALSE)
    

Implementation Benefits

  1. Performance Improvement

    • Only loads and saves the specific row being edited
    • Avoids unnecessary data transfer and processing
  2. Concurrency Support

    • Optimistic concurrency using WATCH prevents data corruption
    • Error handling for conflict resolution
  3. Schema Evolution

    • Version tracking enables smooth migration
    • Backward compatibility maintained
  4. Minimal Code Changes

    • Existing code continues to work
    • Row-level operations available when needed

This implementation provides a comprehensive solution for adding row-level operations to the queue storage module, leveraging Clarion’s built-in features while maintaining compatibility with existing code.
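
To make the "Optimistic Concurrency with WATCH" design decision above a bit more concrete, here is a minimal hand-written sketch (this is not the module the AI actually generated; the file, field, key and procedure names are hypothetical, the PROGRAM/MAP scaffolding is omitted, and the -4 return simply mirrors the convention used in the summary):

    ! Hypothetical storage file holding queue data as row/column cells
    StorageFile   FILE,DRIVER('TOPSPEED'),PRE(STO),CREATE,THREAD
    RowColKey       KEY(STO:RowNo,STO:ColNo),NOCASE
    Record          RECORD
    RowNo             LONG
    ColNo             LONG
    CellValue         STRING(255)
                    END
                  END

    SaveCell      PROCEDURE(LONG pRow, LONG pCol, STRING pValue)  ! prototyped ,LONG in the MAP
      CODE
      WATCH(StorageFile)                   ! arm optimistic concurrency for the next retrieval
      STO:RowNo = pRow
      STO:ColNo = pCol
      SET(STO:RowColKey, STO:RowColKey)    ! position on the (row, col) cell via the composite key
      NEXT(StorageFile)
      IF ERRORCODE() OR STO:RowNo <> pRow OR STO:ColNo <> pCol THEN RETURN -1.
      STO:CellValue = pValue
      PUT(StorageFile)                     ! fails if another station changed the record after it was read
      IF ERRORCODE() THEN RETURN -4.       ! concurrency conflict
      RETURN 0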

It’s similar to COBOL and other languages.

Where I see problems is in trying to get an LLM to work with the template language and the AppGen to insert embed code. My ideal would be to have it build a TXA, import 3rd-party templates into the TXA and then build a program, all in TXA format ready to import into the AppGen and compile.

I think for now, we are just going to have to cut and paste embed code into the AppGen embeds.

Writing classes is something it should be able to handle, which will also help a bit.
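
For example, here is a hand-written sketch of the kind of small, self-contained class an LLM could plausibly produce; the class name and method are made up:

    ! MyStringTools.inc - hypothetical class declaration
    MyStringTools         CLASS,TYPE,MODULE('MyStringTools.clw'),LINK('MyStringTools.clw')
    Squeeze                 PROCEDURE(STRING pText),STRING   ! trim leading and trailing spaces
                          END

    ! MyStringTools.clw - implementation
      MEMBER()
      INCLUDE('MyStringTools.inc'),ONCE
      MAP
      END

    MyStringTools.Squeeze PROCEDURE(STRING pText)
      CODE
      RETURN CLIP(LEFT(pText))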

Where I think MCPs (Model Context Protocol servers) can help is in providing app-specific knowledge, like file/table names and field names, in order to generate the embed code a bit better, plus maybe knowledge of the 3rd-party add-ons being used.
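
As a toy example, if an MCP server told the model that the dictionary has a Customer table (prefix CUS) with a LastContacted date field, it could generate an embed along these lines; the table, field and event are hypothetical and ABC file access is assumed:

    ! Hypothetical embed code, e.g. in a button's Accepted event
    CUS:LastContacted = TODAY()                  ! field name known from the dictionary via the MCP
    IF Access:Customer.Update() <> Level:Benign
      Message('Could not update the Customer record.')
    END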

The point is that software not created to be callable from AI is probably in need of an upgrade, instead of software just being an app…

You might say this is fairyland… not real… not a happening thing…

Really?

Is that so? Say it ain’t so, Joe…

You think you know what’s happening…

Hello, can I have a new Discord invite?

See if this works.

Thanks, it did work.