cynkra


Intelligently R

We integrate AI into data and analytics workflows, build open-source AI tools, and provide infrastructure support for R and Posit environments.

What We Help With



Agentic coding tools that write, refactor, and debug code autonomously, plus inline autocomplete for faster development. We set up Claude Code, GitHub Copilot, Cursor, or local alternatives, and have experience integrating these into R/Posit and restricted government environments.

Coding


Make internal documents searchable with AI. Answers grounded in your data, not generic responses. Fully local deployment possible, so data never leaves your servers. Built with ChromaDB, DuckDB, and rchroma, our open-source R package.

Search


Ask questions of databases and datasets in plain language. Create plots, filter data, generate summaries by describing what you want. Non-technical team members can explore data on their own, no coding required. Built on blockr.ai, our open-source framework.

Analysis

How We Work

From first conversation to running system.


1. Discovery (Understand)

We discuss your goals, figure out where AI helps, and assess your constraints: hardware, budget, data security, and existing infrastructure.

2. Design (Architect)

We pick the right setup: cloud, self-hosted, or hybrid. We select the tools and models that match your use case and security needs.

3. Implementation (Deliver)

We set up the system, integrate it into your environment, and train your team. We stay available for ongoing support, adjustments, and further development.

Featured Projects


rchroma provides an R interface to ChromaDB, enabling vector database operations directly from R. It supports embedding storage, retrieval, and similarity search for AI applications.

Key Features

  • Vector database operations in R
  • Storage and retrieval of embeddings
  • Similarity search for AI applications
  • Integration with R machine learning workflows
open source project
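
As a rough sketch of a minimal rchroma workflow (assuming a ChromaDB server running locally, e.g. via Docker, and precomputed embeddings; the toy 3-dimensional vectors stand in for output from a real embedding model):

```r
library(rchroma)

# Connect to a running ChromaDB instance
# (e.g. started with `docker run -p 8000:8000 chromadb/chroma`)
client <- chroma_connect()

# Create a collection and add documents with precomputed embeddings
create_collection(client, "docs")
add_documents(
  client, "docs",
  documents  = c("Quarterly sales report", "Onboarding checklist"),
  ids        = c("doc1", "doc2"),
  embeddings = list(c(0.1, 0.9, 0.2), c(0.8, 0.1, 0.3))
)

# Retrieve the most similar document for a query embedding
query(client, "docs", query_embeddings = list(c(0.1, 0.8, 0.25)), n_results = 1)
```

In a real deployment the embeddings would come from an embedding model (cloud or local), and the similarity search above becomes the retrieval step of a RAG pipeline.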

Our ask package lets you interact with AI models directly from R, going beyond simple text responses.

Key Features

  • Script and documentation editing in place
  • Code and test generation
  • Package documentation querying
  • Natural language data processing
  • Support for both cloud (GPT-4) and local (Llama) models
open source project

blockr.ai extends our blockr framework with AI capabilities for natural language-driven data analysis.

Key Features

  • AI-powered plot creation through natural language
  • Intelligent data transformations
  • Integration with leading AI models
  • Composable blocks for flexible workflows
  • Seamless integration with the blockr ecosystem
open source project

Cloud vs. Local


  • Cloud with Zero Data Retention

    Claude API, GitHub Copilot Business, Azure OpenAI. Best performance, no data stored by the provider.

  • Self-Hosted LLMs

    Run models on your own infrastructure with Ollama, Open WebUI, or dedicated GPU servers. Data stays in your network.

  • Hybrid

    Cloud for non-sensitive tasks, local models for sensitive data. We use a decision matrix to map each use case against data sensitivity, quality needs, and budget.
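
To illustrate the self-hosted option, a request against a local Ollama instance from R might look like this (a sketch using httr2; assumes Ollama is running on its default port with the llama3.1 model pulled, and the prompt is hypothetical):

```r
library(httr2)

# Query a self-hosted model through Ollama's HTTP API;
# the data never leaves the local network
resp <- request("http://localhost:11434/api/generate") |>
  req_body_json(list(
    model  = "llama3.1",
    prompt = "Summarise this quarter's sales trend in one sentence.",
    stream = FALSE
  )) |>
  req_perform()

resp_body_json(resp)$response
```

Swapping the URL and model name is all it takes to point the same code at a different self-hosted endpoint, which keeps the hybrid setup flexible.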

AI Insights

Practical examples of working with LLMs in R


David Schoch, Christoph Sax

R with RAGS: An Introduction to rchroma and ChromaDB

LLM/RAG/R

Large language models (LLMs) are developing rapidly, but they often lack real-time, specific information. Retrieval-augmented generation (RAG) addresses this by letting LLMs fetch relevant documents during text generation, instead of relying only on their internal (and potentially outdated) knowledge.


Christoph Sax

Playing with AI Agents in R

LLM/R

It's local LLM time! What an adventure it has been since I first started exploring local LLMs. With the introduction of various new Llama models, we now have impressive small and large models that run seamlessly on consumer hardware.


Christoph Sax

Playing with Llama 3.1 in R

LLM

Meta recently announced Llama 3.1, and there's a lot of excitement. I finally had some time to experiment with locally run open-source models. The small 8B model, in particular, produces surprisingly useful output, with reasonable speed.