I’ve looked at Hickey’s example and the interface and there is no provision for adding files to the chat.
I’ve scanned the ClarionLive videos and do not see any that discuss this.
Hi Paul
The Anthropic API itself doesn’t have a concept of attaching files to a chat like the web UI does.
If you’re using Clarion to call the API, you’d need to read the file contents yourself and include them in the prompt (possibly chunked if large).
Tools like Claude Code and GitHub Copilot Chat can “see” files because they provide that context outside of the API, but that’s not something the API does for you automatically.
If you explain what you’re trying to achieve (e.g. summarising files, analysing code, etc.), there may be a pattern other users can suggest.
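A minimal sketch of that "read the file yourself and include it in the prompt" approach, in Python. The request body shape (model, max_tokens, messages) matches the Anthropic Messages API; the model name here is just a placeholder, and no network call is made — a real call would POST this JSON to `https://api.anthropic.com/v1/messages` with `x-api-key` and `anthropic-version` headers.

```python
import json
from pathlib import Path

def build_messages_payload(file_path, question, model="claude-3-5-sonnet-latest"):
    """Build a Messages API request body with a file's text inlined in the prompt.

    The file is read locally and embedded in the user message, since the API
    has no separate "attach file" mechanism for plain text like the web UI does.
    """
    file_text = Path(file_path).read_text(encoding="utf-8")
    prompt = (
        "Here is the content of a file for context:\n\n"
        f"<file name=\"{Path(file_path).name}\">\n{file_text}\n</file>\n\n"
        f"{question}"
    )
    return {
        "model": model,
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }

# Example: write a small file and build the payload (no network call here).
Path("notes.txt").write_text("Invoices are due net 30.", encoding="utf-8")
payload = build_messages_payload("notes.txt", "Summarise this file.")
print(json.dumps(payload)[:60])
```

Large files would need chunking before being inlined, since the whole prompt has to fit in the model's context window.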
Mark
This appears to also allow files for file context to help reduce hallucinations, but it doesn’t list which file extensions are accepted, so it’s probably the default Claude ones: PDF, DOCX, TXT, CSV, and XLSX, up to 20 files per chat, each with a file size limit of 30MB.
VS Marketplace and GitHub links below.
Don’t know if this can be adapted for the Clarion IDE; if it can, it should do what you want.
This is from Nov 2024, so it is possibly a little out of date in terms of the features offered by the LLMs.
That API seems to have it in beta, with code examples in their API kits (except for C#).
We’ve been using a front end called “big-AGI” hosted internally which has a provision for file uploads, but, as you mention, this front end could just be including it in the chat.
Thanks Rich. I’ll take a look at those resources.
FYI, I’ve used all the file types you mention, but XLSX seems to be a problem for them.
We had total hallucination when using XLSX files. The workaround was to save as PDF and use the PDF.
What I am trying to do is create an interface where we can gather specific data from certain places and apply a specific (and consistent) prompt and record the results.
We would point to a folder of data files - and the prompt would vary based on a couple of factors.
This would be repetitive.
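That repetitive folder-plus-consistent-prompt workflow could be sketched like this (Python). The folder layout, the prompt table, and the `ask_model` stub are all hypothetical; `ask_model` would be replaced with a real Anthropic Messages API call.

```python
import csv
from pathlib import Path

# Hypothetical prompt variants keyed by a factor (e.g. document type).
PROMPTS = {
    "invoice": "Extract the total amount and due date from this document.",
    "contract": "List the parties and the termination clause from this document.",
}

def ask_model(prompt, file_text):
    """Stub for the real API call; replace with an Anthropic Messages request."""
    return f"[model answer for: {prompt[:30]}...]"

def run_batch(folder, factor, results_csv):
    """Apply one consistent prompt to every .txt file in a folder and record results."""
    prompt = PROMPTS[factor]
    rows = []
    for path in sorted(Path(folder).glob("*.txt")):
        answer = ask_model(prompt, path.read_text(encoding="utf-8"))
        rows.append({"file": path.name, "factor": factor, "result": answer})
    # Record every result so runs are repeatable and comparable.
    with open(results_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["file", "factor", "result"])
        writer.writeheader()
        writer.writerows(rows)
    return rows

# Example run over a small test folder with one file.
Path("data").mkdir(exist_ok=True)
Path("data/a.txt").write_text("Invoice 123, total 500, due 2025-01-31.", encoding="utf-8")
rows = run_batch("data", "invoice", "results.csv")
print(len(rows))
```

Keeping the prompt fixed per factor and logging every result is what makes the runs consistent and auditable.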
It’s still in beta.
This is why I think a local LLM is a better route: more control, provided devs are aware of the overtraining issues.
See the table of spreadsheet reading limits here.
I think a RAG agent will be better for you.
Building a Claude Retrieval-Augmented Generation (RAG) system involves creating a pipeline to ingest, embed, and store data in a vector database, then using Claude 3.5 Sonnet to query this data. Key steps include using LangChain or LlamaIndex to chunk documents, ChromaDB or MongoDB as the store, and utilizing Anthropic API keys to generate context-aware answers, ideally employing contextual embeddings for improved accuracy.
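A minimal sketch of the retrieve-then-ask pattern at the heart of that pipeline (Python). A toy bag-of-words embedding and in-memory list stand in for a real embedding model and vector store like ChromaDB; the helper names are made up for illustration.

```python
from collections import Counter
from math import sqrt

def embed(text):
    """Toy embedding: bag-of-words counts (a real system calls an embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(chunks, question, k=1):
    """Rank stored chunks by similarity to the question and return the top k."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)[:k]

chunks = [
    "UK income tax bands for 2024 are listed in the HMRC guidance.",
    "The office coffee machine is cleaned on Fridays.",
]
context = retrieve(chunks, "What are the UK income tax bands for 2024?")
# The retrieved chunk(s) would then be placed into the Claude prompt as context,
# so the model answers from your data instead of guessing.
print(context[0])
```

The real pipeline swaps `embed` for an embedding API and the list for a vector database, but the retrieve-then-prompt flow is the same.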
Don’t know if this is of interest…
The plugin enforces disciplined practices like “red-green-refactor” TDD cycles where tests must fail before implementation, a four-phase debugging methodology that requires root cause investigation before any fixes, and Socratic brainstorming sessions that refine requirements before coding begins.
Well, we don’t have an API in Clarion, but our Linux binding machine in CPP is now starting beta testing on Windows, and we selected old Clarion Win32 to do the testing. And yes, Claude can be accessed using the binding machine’s 200 KB DLL compiled in CPP.
Linking the CPP code base into Clarion projects is super simple, as old Clarion was built in the “OLD DAYS” of computing. Connecting to CPP DLLs can be done by anyone with a version of Clarion back to version ? just by looking up the help; you don’t even need to be a computer scientist. You could probably even link in the base Claude CPP SDK, but we prefer to do it at a script level that is smaller than Python and uses direct address calling: no Python runtime overhead or OpenClaw overhead.
CODE
Ai
Ai PROCEDURE
AIScripts cstring(2000)
AIKey cstring(255)
AIPrompt cstring(1000)
AIResponse cstring(2000)
docID long
CODE
! set up and connect to AI via UBS.
SpecialFoldersref &= new(SpecialFolder) ! Document Folders
NewUBSInstance
if AppiUBSref &= NULL
return
END
AIPrompt = 'What is the UK individual personal income tax rates for 2024?'
! Build UBS ai bindable script.
! Semicolons separate statements. Last expression is the DocStrId return value.
AIScripts = 'ai.provider(''claude'');' & |
'ai.key(''' & clip(AIKey) & ''');' & |
'ai.model(''claude-haiku-4-5-20251001'');' & |
'ai.system(''You are a tax assistant.'');' & |
'ai.ask(''' & clip(AIPrompt) & ''');'
! Execute script - last expression result returned via DocStrId.
docID = AppiUBSref.Document(AIScripts)
if docID > FALSE
AppiUBSref.DocId(docID)
AIResponse = AppiUBSref.DocStrId(docID)
AIResponse = StripUTF8Chars(AIResponse)
TextManagerDisplay(AIResponse)
END
The binding machine gives us everything Python has and more, since AI bound itself into the machine. That’s right, we did not plumb AI into UBS: AI DID! AI is now set up to generate CPP bindable components for any library or operating system you point it at. You could even create a new OpenClaw yourself using the above example.
Anthropic have sent out an email saying 3rd-party harnesses, i.e. OpenClaw, now require a new separate subscription on top of the existing subscription, as of yesterday (4th April).
It’s basically a price rise, although I have noticed OpenClaw has been getting a lot of attention recently.
Any log-on with the API key uses a token charge instead of your sub; in the example posted above it’s key access to the service. But there are free AI services, and the example above would just require a change to the provider. “OLD CLARION” does the job perfectly. Don’t you love it when decades-old code keeps going and you can go sailing?
And don’t forget: your OpenClaw or co-worker models, if created in Clarion, will run in how much memory? Hardly any. Try researching co-worker agents and see how much memory they use.
AI bound the Windows OS functions into the above example and added some CPP JSON functions. Virtually no overhead to call AI services and return the results to Clarion. AI thought it was a good use of memory.
Take care, see you.
I must be the unluckiest person on the planet then, because I’ve not been able to get ChatGPT, Copilot or Gemini to generate any fully working Clarion code; in fact, Copilot has even quoted my own bug-ridden code from a hidden repo on GitHub.
We haven’t used AI to generate Clarion code, only CPP code.
Here is a Claude test for generating a Clarion class with not much info: just the interface definition for a bindable that can be registered in the Binding Machine (a Forth-like processor written in CPP, though UBS has more stacks than Forth and everything is a BINDABLE). We use it on Linux to replace Python, as UBS can run on very small devices and takes up only 200KB on startup.
Looks like Claude can generate Clarion code, maybe not complex stuff.
! (c) Copyright Quantum Dynamics Ltd 2025
! CUniBindable - Base Clarion implementation of IUniBindable
! Implements IUniBindable,com to be vtable-compatible with UBS CPP ScriptBindable
! Subclass and override methods as required for specific bindable behaviour.
CUniBindable CLASS,TYPE,IMPLEMENTS(IUniBindable)
! Properties
! Lifecycle
Construct PROCEDURE
Destruct PROCEDURE
END
cuniref &CUniBindable
CODE
cuniref &= new(CUniBindable)
message(size(cuniref))
! ============================================================================
! Construct
! ============================================================================
CUniBindable.Construct PROCEDURE
CODE
! ============================================================================
! Destruct
! ============================================================================
CUniBindable.Destruct PROCEDURE
CODE
! ============================================================================
! IUniBindable.getAsInteger
! ============================================================================
CUniBindable.IUniBindable.getAsInteger PROCEDURE()
CODE
RETURN(0)
! ============================================================================
! IUniBindable.getAsNumber
! ============================================================================
CUniBindable.IUniBindable.getAsNumber PROCEDURE()
CODE
RETURN(0.0)
! ============================================================================
! IUniBindable.getAsString
! ============================================================================
CUniBindable.IUniBindable.getAsString PROCEDURE()
AStrValue ANY
CODE
AStrValue = ''
RETURN(AStrValue)
! ============================================================================
! IUniBindable.copy
! ============================================================================
CUniBindable.IUniBindable.copy PROCEDURE()
CODE
RETURN(0)
! ============================================================================
! IUniBindable.setToInteger
! ============================================================================
CUniBindable.IUniBindable.setToInteger PROCEDURE(LONG integer)
CODE
! ============================================================================
! IUniBindable.setToNumber
! ============================================================================
CUniBindable.IUniBindable.setToNumber PROCEDURE(REAL number)
CODE
! ============================================================================
! IUniBindable.setToString
! ============================================================================
CUniBindable.IUniBindable.setToString PROCEDURE(CONST *CSTRING string)
CODE
! ============================================================================
! IUniBindable.setToBindable
! ============================================================================
CUniBindable.IUniBindable.setToBindable PROCEDURE(LONG b)
CODE
! ============================================================================
! IUniBindable.invoke
! ============================================================================
CUniBindable.IUniBindable.invoke PROCEDURE(LONG ifc)
CODE
RETURN(0)
! ============================================================================
! IUniBindable.selectMember
! ============================================================================
CUniBindable.IUniBindable.selectMember PROCEDURE(CONST *CSTRING memberName, LONG equ, LONG returnequaddress)
CODE
RETURN(ADDRESS(SELF))
! ============================================================================
! IUniBindable.removeMember
! ============================================================================
CUniBindable.IUniBindable.removeMember PROCEDURE(CONST *CSTRING memberName)
CODE
! ============================================================================
! IUniBindable.existsMember
! ============================================================================
CUniBindable.IUniBindable.existsMember PROCEDURE(CONST *CSTRING memberName)
CODE
RETURN(0)
! ============================================================================
! IUniBindable.nextMemberName
! ============================================================================
CUniBindable.IUniBindable.nextMemberName PROCEDURE(CONST *CSTRING memberName)
CODE
RETURN(0)
! ============================================================================
! IUniBindable.share
! ============================================================================
CUniBindable.IUniBindable.share PROCEDURE()
CODE
! ============================================================================
! IUniBindable.unShare
! ============================================================================
CUniBindable.IUniBindable.unShare PROCEDURE()
CODE
! ============================================================================
! IUniBindable.getDescription
! ============================================================================
CUniBindable.IUniBindable.getDescription PROCEDURE(BYTE metatype)
ADescValue ANY
CODE
ADescValue = 'CUniBindable'
RETURN(ADescValue)
! ============================================================================
! IUniBindable.getBinding
! ============================================================================
CUniBindable.IUniBindable.getBinding PROCEDURE()
CODE
RETURN(ADDRESS(SELF))
I think part of the problem is that people are expecting AI to already be a Clarion programmer, and it is not.
I use AI every day, probably 8 to 12 hours a day, using Claude, Gemini, and ChatGPT, mostly ChatGPT, and I generate thousands of lines of good Clarion code every day. Real code, real problems, real work.
But that does not happen because the AI somehow fully knows Clarion. Clarion is a niche language. These models have seen enormous amounts of Python, JavaScript, C#, Java, and other mainstream languages, but nothing like that volume of Clarion. So you cannot just ask for Clarion code and assume the model already has the right patterns in its head.
The good news is that it does not need to know all of Clarion. It only needs to know enough about the part of Clarion you are working with, the context it is working in, and the rules you want it to follow.
That is really the key. You have to give it the right frame of reference. Show it what good looks like, what done looks like, and what bad looks like. Give it your standards, your patterns, and your do-not-do list. Keep it grounded in the current task instead of treating it like a generic programmer you just hired off the street.
Once you do that, it can be extremely effective. In my experience, the issue is usually not that AI cannot generate good Clarion code. The issue is that people are assuming too much and guiding it too little.
Anthropic have updated their information on prompting and setting up tag messages.
Yes, and I think that actually supports the point.
Indeed Anthropic has updated their prompting guidance and they do recommend using tags and clearer structure. That helps a lot.
But when you break it down, what it means is that the more clearly you separate instructions, context, examples, and expected output, the better the model performs.
In a niche language like Clarion, that matters even more.
So to me the takeaway is: “give it the right structure, context, and guardrails, and it can do very good work.”
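A minimal sketch of that separation in practice (Python). The XML-style tags follow the structure Anthropic’s prompting guidance recommends, but the specific section names and Clarion content here are illustrative, not a prescribed format.

```python
def build_structured_prompt(instructions, context, examples, task):
    """Assemble a prompt with clearly separated, tagged sections."""
    parts = [
        f"<instructions>\n{instructions}\n</instructions>",
        f"<context>\n{context}\n</context>",
        f"<examples>\n{examples}\n</examples>",
        f"<task>\n{task}\n</task>",
    ]
    return "\n\n".join(parts)

# Example: grounding the model in Clarion-specific standards before the task.
prompt = build_structured_prompt(
    instructions="You write Clarion 11 code. Follow the standards in <context>.",
    context="Classes are declared with CLASS,TYPE. Use CSTRING for C-compatible strings.",
    examples="MyClass CLASS,TYPE\nInit   PROCEDURE\n       END",
    task="Write a small class that wraps a file handle.",
)
print(prompt.split("\n")[0])
```

Keeping each section in its own tagged block is what lets a niche-language request carry its own frame of reference: the standards, the patterns, and what “good” looks like travel with every prompt.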