Do you want to bring #generativeAI to your mission, but you work in a classified or air-gapped environment? Major #AI services, like ChatGPT, run in the cloud and use Internet-facing APIs. That's a no-go if you're disconnected, and even if you have connectivity, you may not be allowed to share your sensitive mission data with a third-party service. We created #LeapfrogAI to overcome these problems by enabling you to self-host generative AI models with your mission data in your mission environment. Want to learn more? Join us for our LinkedIn Live event this Thursday at 1500 ET. 🔗https://lnkd.in/gGma-ygF Gerred D. and Barron Stone will dive into the world of LeapfrogAI, including live demos and Q&A.

Not trying to be contrarian, but that hasn't been my experience. I have a GPT-style LLM that lives locally on my laptop without phoning home anywhere. It was relatively easy to install, and while it's only a small 13B model, it performs at about 80-90% of ChatGPT 3.5. In some cases I've been able to get it to do things that ChatGPT 3.5 can't, like solving the rooster, donkey, potato riddle. It took a lot of back and forth to help it understand, but even greater effort didn't work with OpenAI's ChatGPT 3.5. I'm not totally sure, but I think with some training and fine-tuning I can get it to perform far better in a few specific areas I'm interested in. I'm only doing this on the laptop first to see if it's worth putting on the ol' PowerEdge R730. Anyways, since your stuff is open source, I'll probably give it a spin as well. Alpaca/LoRA/Vicuna have made some really good progress on running on smaller hardware.
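For context on why a 13B model can fit on laptop-class hardware at all, here's some rough back-of-the-envelope arithmetic. The 20% runtime overhead factor is a loose assumption to cover the KV cache and buffers, not a measured figure:

```python
def quantized_model_size_gb(n_params: float, bits_per_weight: int,
                            overhead: float = 1.2) -> float:
    """Rough memory footprint of a quantized LLM. The overhead factor
    is an assumed ~20% for KV cache and runtime buffers."""
    total_bytes = n_params * bits_per_weight / 8 * overhead
    return total_bytes / 1e9

# A 13B-parameter model at common precision levels:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{quantized_model_size_gb(13e9, bits):.1f} GB")
# → 16-bit: ~31.2 GB, 8-bit: ~15.6 GB, 4-bit: ~7.8 GB
```

At 4-bit quantization the weights squeeze into roughly 8 GB, which is why projects like llama.cpp can run 13B models on ordinary laptops.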

I’ve heard about some lessons learned over at CDAO and would like to learn how you might have addressed the problems they’ve seen. If anyone has time to chat about Leapfrog, I’d love to learn more.


Great video!! Commenting for my network, for anyone interested in AI in secure environments!

Barron, thanks for the overview. Definitely an interesting solution, and I appreciated your point at the end about connecting GenAI solutions to organizational data. That's something I think is getting lost in the hype around GenAI, and it's one of the first things I bring up with a customer interested in this technology. If you've got a fragmented or siloed data architecture, the best LLM in the world won't be able to meet the needs of your mission. A logical first step is to get your data house in order, then bring in an LLM for a specific use case, and then start to scale and expand. I just get the sense that folks out there are trying to do too much, too quickly. Have you seen some of the work that Elastic and David Erickson have been doing to support GenAI use cases from a data perspective? This blog might be up your alley: https://www.elastic.co/search-labs/privacy-first-ai-search-langchain-elasticsearch
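To illustrate that "connect the LLM to your data" point, here's a toy sketch of the retrieval-grounded pattern: find relevant organizational documents first, then build a prompt around them. The keyword scoring and document names below are hypothetical stand-ins, not Elastic's or LangChain's actual API; a real deployment would replace retrieve() with an Elasticsearch query:

```python
def retrieve(query: str, docs: dict, k: int = 2) -> list:
    """Score each document by word overlap with the query and return the
    top-k document IDs. A toy stand-in for a real search backend."""
    q_words = set(query.lower().split())
    ranked = sorted(docs.items(),
                    key=lambda kv: -len(q_words & set(kv[1].lower().split())))
    return [doc_id for doc_id, _ in ranked[:k]]

def build_prompt(query: str, docs: dict) -> str:
    """Assemble a prompt that grounds the LLM in retrieved org data."""
    context = "\n".join(docs[doc_id] for doc_id in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical organizational documents:
org_docs = {
    "policy": "mission data must stay inside the accredited environment",
    "roster": "the platform team meets every thursday",
}
print(build_prompt("where must mission data stay", org_docs))
```

The point of the pattern: if the documents themselves are fragmented or inaccessible, no amount of model quality fixes the retrieval step, which is why the data architecture comes first.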


Will this be recorded for people who are unavailable at the designated time?

Amber Whittington Love it! If the DoD already has the Big Data Platform (BDP), a fully accredited, government-owned platform, the same approach could be applied to an #AI service like ChatGPT.


