Jesus Rodriguez
Entrepreneur and Investor with extensive experience in AI and Blockchain
Jesus Rodriguez is a prominent technology expert, entrepreneur, and investor based in Miami, Florida.1 He is currently an Investor and Board Member at Trident Digital, a position he has held since September 2023.1
Professional Background
Rodriguez has an extensive career in technology and entrepreneurship:
Current Roles:
- Co-Founder and CEO of IntoTheBlock (since November 2018)1
- Co-Founder and President of Faktory (since January 2023)1
- Co-Founder and President of NeuralFabric Corp. (since August 2023)1
- Chief Scientist and Managing Partner at Invector Labs (since January 2018)1
Academic Involvement:
- Guest Lecturer at The Wharton School and Columbia University1
Entrepreneurial Ventures: Rodriguez has founded or co-founded several companies, including Tellago, Inc. and Tellago Studios.1 He is also the co-founder and editor of The Sequence AI, a publication focused on artificial intelligence.1
Investment Portfolio
As an investor, Rodriguez has stakes in various technology and blockchain companies:
- Flare Network
- Snickerdoodle Labs
- Mnemonic, Inc
- Tres.finance
- Enhanced Digital Group (EDG)1
Expertise and Recognition
Jesus Rodriguez is known for his expertise in:
- Cryptocurrency and blockchain technology
- Artificial intelligence
- Enterprise software development
- Mobile app platforms2
He is an internationally recognized speaker and author who has contributed hundreds of articles and presentations at industry conferences.2 Rodriguez's background as a software scientist has made him a sought-after advisor for major tech companies such as Microsoft and Oracle.
LinkedIn Profile
His LinkedIn username is jesusmrv, where he maintains an active presence sharing insights on technology, particularly in the areas of DeFi (decentralized finance) and cryptocurrency.3
Highlights
Most “open-source” LLMs today are not really open-source at all – they’re open-weights: you get a giant tensor file, but not the full story of how it came to be.
Olmo 3 https://t.co/L1ZZTkkayF might be the most genuinely open model we’ve seen so far.
Most “open” LMs today are really just open-weight: you get a frozen checkpoint, a restrictive license, and some eval numbers. With Olmo 3, AI2 is open-sourcing the entire model flow — data recipes, training code, RL stack, eval harness, checkpoints. You don’t just get the dish, you get the kitchen, the ingredients list, and the cooking logs.
Concretely:
A family of 7B & 32B models (Base, Instruct, Think) aimed at long-context reasoning, coding, tools, general chat. A strong, fully open 32B base model plus a “Think” variant tuned for multi-step reasoning and chain-of-thought–style tasks. All the plumbing exposed: Dolma/Dolci data stack, dedup + filtering pipelines, pretraining code, instruction-tuning + RL framework, RL datasets, and an evaluation suite wired for honest comparison.
Why this matters:
- Reproducibility: you can rerun the training, inspect any stage, and ask “what happens if I change this?” Instead of speculating about scaling laws or RL effects, you can just run the experiment.
- Real science: this is a platform for ablations, covering data slices, optimizer swaps, architecture tweaks, and safety interventions. Closed APIs reduce you to prompt engineering; open flows let you do model engineering.
- Forkability: want a math-only Think model? A finance-tuned Olmo? An RL recipe aligned with your own reward model? You can branch from a state-of-the-art checkpoint and keep the full lineage, as in the sketch below.
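To make forkability concrete, here is a toy sketch of branching from an open base checkpoint and continuing causal-LM training on a narrow domain corpus. The model ID, corpus, and hyperparameters are all placeholder assumptions, not part of the Olmo 3 release; the real instruction-tuning and RL stack is a separate, much larger codebase, and a 7B model needs real GPU memory rather than a loop like this.

```python
# Toy sketch of forking an open checkpoint: continue causal-LM training
# on a small domain-specific corpus. Assumes the (hypothetical) model ID
# below exists and that enough GPU memory is available for a 7B model.
from torch.optim import AdamW
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/Olmo-3-7B"  # hypothetical base-model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.train()

# Stand-in for a math-only or finance-only data slice.
corpus = [
    "Q: What is 17 * 24? A: 408.",
    "Q: Simplify 3/9. A: 1/3.",
]

optimizer = AdamW(model.parameters(), lr=1e-5)
for text in corpus:
    batch = tokenizer(text, return_tensors="pt")
    out = model(**batch, labels=batch["input_ids"])  # shifted-label LM loss
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```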
From a “Karpathy mental model” perspective, Olmo 3 feels like shipping the whole training script for the brain, not just the frozen cortex. Students can learn end-to-end training at modern scale. Researchers can test interpretability ideas on a live, evolving stack. Startups can build specialized copilots without betting everything on a black-box vendor.
If we think of LLMs as the compilers and operating systems of the AI era, Olmo 3 is closer to GNU/Linux than to a SaaS endpoint. In a future where most frontier models are closed, fully open model flows like this might be how we keep AI a science rather than just a subscription — something you can read, modify, and understand, not just query.
I try to outline some practical hard truths about building sustainability in DeFi ecosystems based on lessons learned at @intotheblock. Thanks @CoinDesk for publishing.