Running Phi3


Can the BeagleBone AI-64 run Phi3 Small (7B) or Phi3 Medium (14B)?

Thanks for the response; could you please clarify whether it’s a yes or a no?

I’m inquiring with TI. Do you have any details on the model structure or execution environment?

It’d be Phi3 Mini with a 128k context size, ideally running on Ollama.
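For reference, a minimal sketch of what that setup could look like, assuming an arm64 Linux image on the board with enough free RAM for a quantized ~3.8B model; the `phi3:mini-128k` tag name is an assumption here and should be checked against the Ollama model library before use:

```shell
# Install Ollama via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull the 128k-context Phi-3 Mini variant
# (tag name assumed -- verify with the Ollama library / `ollama list`)
ollama pull phi3:mini-128k

# Start an interactive session
ollama run phi3:mini-128k
```

Whether this fits in memory depends on the quantization Ollama ships for the tag and on how much RAM the board has free after the OS and display stack.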