Participating in the Defense Tech Hackathon '24

Roughly 300 million terabytes of data are created every day, drawn from disparate sources and modalities and often carrying little structure. Our mission is to harness that information and turn it into a strategic advantage for mission stakeholders. Our pipeline uses Large Language Models (LLMs) to extract entities from raw data and loads them into a graph database for efficient querying and analysis; we are expanding its capabilities to include time-series trending, sentiment analysis, and comprehensive network summaries. Next, we plan to extend the pipeline to a wider array of sources and modalities, including images, video, and telemetry, and to build an army of intelligent analyst agents that deliver real-time, actionable intelligence.
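As a rough illustration of the extract-then-ingest step, here is a minimal sketch. The source does not name the model, prompt, or graph database used, so the LLM call is stubbed with canned output, the JSON schema (`name`/`type`/`relations`) is an assumption, and an in-memory structure stands in for the graph store:

```python
def extract_entities(text: str) -> list[dict]:
    """Stub for an LLM entity-extraction call (hypothetical schema).

    A production version would prompt a model to emit JSON like
    [{"name": ..., "type": ..., "relations": [...]}, ...].
    """
    # Canned output illustrating the assumed schema.
    return [
        {"name": "Vessel-7", "type": "ship",
         "relations": [{"target": "Port of Oakland", "kind": "docked_at"}]},
        {"name": "Port of Oakland", "type": "location", "relations": []},
    ]


class EntityGraph:
    """Minimal in-memory stand-in for the graph database."""

    def __init__(self) -> None:
        self.nodes: dict[str, str] = {}              # entity name -> type
        self.edges: list[tuple[str, str, str]] = []  # (source, kind, target)

    def ingest(self, entities: list[dict]) -> None:
        # Upsert each entity as a node and each relation as a directed edge.
        for ent in entities:
            self.nodes[ent["name"]] = ent["type"]
            for rel in ent.get("relations", []):
                self.edges.append((ent["name"], rel["kind"], rel["target"]))

    def neighbors(self, name: str) -> list[str]:
        # Efficient querying in the real system would be a graph-DB traversal.
        return [dst for src, _kind, dst in self.edges if src == name]


graph = EntityGraph()
graph.ingest(extract_entities("Vessel-7 docked at the Port of Oakland overnight."))
print(graph.neighbors("Vessel-7"))  # ['Port of Oakland']
```

In the real pipeline the graph layer would be an actual graph database, so queries like `neighbors` become traversals, and downstream features such as network summaries operate over the same node/edge structure.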
