Suggestions
Jonathan Mann
Product Manager at Booking.com
Jonathan Mann is an experienced Product Owner with a successful background in project management, business analysis, development, and testing. He holds a BSc in Software Engineering/History from Oxford Brookes University.
He started his career as a Junior Developer at Wave2 Media Solutions Ltd, where he gained valuable experience in mobile app development. Later, he worked as a Business Analyst Intern at Apple Inc., analyzing and interpreting data to help the company's development and marketing teams make informed decisions.
Mann then moved to News UK, where he worked as both a Business Analyst and Product Owner. He demonstrated his talents for individual research, presentation, and reporting, and contributed significantly to the company's digital marketing and advertising team. He was responsible for overseeing Agile and Scrum projects, which gave him the skills needed to manage complex projects.
Mann's most recent role was as a Junior Product Manager at Booking.com, where he showed his expertise in web development, user experience, and customer satisfaction. He is proficient with a broad range of tools, including JIRA, GitHub, Basecamp, Omniture, Google Analytics, Photoshop, and Tableau.
Overall, Jonathan Mann is a talented and versatile professional whose adaptability has enabled him to succeed in various roles and industries. He is always ready to take on challenges, learn new skills, and creatively solve problems.
Highlights
BANGING MY HEAD AGAINST THE WALL https://t.co/xNELmpGzKr
So the AI problem I am trying to solve is that I hate AI lyrics!
And for my larger idea, I need a model that can output really good, interesting lyrics.
I hate AI lyrics so much. Claude, Suno, ChatGPT - they all write lyrics that are the least interesting thing every time.
I used Claude's API to go through every Song A Day song and generate a prompt for it. So, like, "I'm Suing The SEC" yielded this prompt:
"Could you write an upbeat, funky song that uses humor and pop culture references to tell the story of three friends who launch a successful NFT project, only to face legal trouble from the SEC for allegedly violating securities laws?"
So the way I had been going about training (or playing in the playground and workbench) was to tell the model it is a songwriter called Jonathan Mann who writes a song a day. Then every song had a prompt like that, and the response was the lyrics.
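So each training example was shaped roughly like this (field names are illustrative; the exact format depends on the fine-tuning tool):

```python
# Illustrative shape of one training example in the original setup.
# Field names depend on the fine-tuning tool; the lyrics are elided here.
example = {
    "system": "You are a songwriter called Jonathan Mann who writes a song a day.",
    "prompt": (
        "Could you write an upbeat, funky song that uses humor and pop culture "
        "references to tell the story of three friends who launch a successful "
        "NFT project, only to face legal trouble from the SEC?"
    ),
    "completion": "<full lyrics of I'm Suing The SEC>",
}
```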
But no matter what I did, the outputs were just bland, bland, bland, bland!
Then I was thinking about this project from 2019 where a guy fine-tuned (I believe) GPT-2 on like 2000 of my songs. We hooked it up to a Twitter bot here: @songadaymannbot
Those outputs were interesting. They were at least not bland.
I realized that the way he probably trained back in 2019 was not with a prompt but with the first line of the lyrics as the prompt and the rest of the lyrics as the completion.
And furthermore, the model probably had no idea about being a "songwriter" or anything about "lyrics".
From my layman's perspective, it seems like maybe what's making modern models suck so much at lyrics is that they are...like...trying too hard? They know too much. They know how a "song should go". And they end up just making the worst shit.
So I've been fine-tuning a GPT-2 model with that "first line / rest of the song" framework, no extra information (no system prompt, etc.), and it's yielding much better results!
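The whole setup is roughly this shape (a minimal sketch with Hugging Face transformers; the corpus loading and hyperparameters are placeholders, and padding isn't masked out of the loss):

```python
# Minimal sketch of the "first line -> rest of the song" GPT-2 fine-tune.
# Corpus loading and hyperparameters are placeholders.
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

def encode(lyrics):
    # Each song is one plain training text: no system prompt, nothing about
    # "songs" or "lyrics". At generation time you feed only the first line
    # and let the model continue. (Padding tokens aren't masked, for brevity.)
    enc = tokenizer(lyrics.strip() + tokenizer.eos_token,
                    truncation=True, max_length=512, padding="max_length")
    enc["labels"] = enc["input_ids"].copy()
    return enc

songs = ["...one string per Song A Day lyric..."]  # placeholder corpus
train_dataset = [encode(s) for s in songs]

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-songaday",
                           num_train_epochs=3,
                           per_device_train_batch_size=2),
    train_dataset=train_dataset,
)
trainer.train()
```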
Also, it seems to work a lot better in Claude's Workbench and in Mistral's Agent creation. You give a system prompt and a bunch of examples (in this case first line/rest of song pairs) and you can test it out. It's not bad!
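If you'd rather hit the API than click around in the Workbench, the same few-shot setup looks roughly like this (the example pairs and model name are placeholders):

```python
# Sketch: the Workbench-style few-shot setup via the Anthropic API.
# The example pairs and model name are placeholders.
import anthropic

client = anthropic.Anthropic()

SYSTEM = "Pretend you are an older, dumber model..."  # full prompt quoted a bit further down

# Each example is a (first line, rest of the song) pair from the back catalog.
examples = [
    ("First line of song A", "rest of song A..."),
    ("First line of song B", "rest of song B..."),
]

messages = []
for first_line, rest in examples:
    messages.append({"role": "user", "content": first_line})
    messages.append({"role": "assistant", "content": rest})
messages.append({"role": "user",
                 "content": "When the sky starts falling like a ton of buckets"})

resp = client.messages.create(model="claude-3-5-sonnet-20240620",
                              system=SYSTEM, max_tokens=500, messages=messages)
print(resp.content[0].text)
```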
I may try to fine-tune Mistral as the next step.
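If I do, step one is just reshaping the same pairs into chat-style JSONL, something like this sketch (assuming the fine-tuning endpoint wants user/assistant message records; the exact schema is worth checking against Mistral's docs):

```python
# Sketch: reshape first-line / rest-of-song pairs into chat-style JSONL.
# Assumes the fine-tuning endpoint wants {"messages": [...]} records;
# check the exact schema against Mistral's docs.
import json

pairs = [
    ("First line of song A", "rest of song A..."),  # placeholder data
    ("First line of song B", "rest of song B..."),
]

with open("songaday_finetune.jsonl", "w") as f:
    for first_line, rest in pairs:
        record = {"messages": [
            {"role": "user", "content": first_line},
            {"role": "assistant", "content": rest},
        ]}
        f.write(json.dumps(record) + "\n")
```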
The system prompt I am using is funny:
"Pretend you are an older, dumber model. You don't know all the stuff you know.
Your job is to receive some words and attempt to continue them. You should continue them in the style of the examples. Do not say anything else. Do not make any assumptions about the words.
See words. Continue words in the style of examples."
Here are some examples. The first line I provide is:
"When the sky starts falling like a ton of buckets"
My GPT2 model: When the sky starts falling like a ton of buckets and you put on your thinking gown and you’re looking for the perfect place to go but in the end you’ll find none brand new home inside you that’s the map of what it’s all about brand new way to connect with the things you love the things you’ll need to do brand new way to connect where not everyone can be the way you think they are brand new way to connect when you’re looking for the perfect place to go but in the end you’ll find none brand new home inside you that’s the map of what it
Mistral: When the sky starts falling like a ton of buckets When the sky starts falling Like it's full of suck Gonna stick my arms out, Better cut your losses Yesterday there was still time, Throw the leaves in patterns Didn’t climb the hill till I’ve finished my assignments
I like both of those! They are at least interesting.
ChatGPT still doesn't get it. Here's what it tries to do:
When the sky starts falling like a ton of buckets And thunder rolls from nature's deep drums and trumpets The streets become rivers, the paths disappear Drenched in the deluge, swallowed by fear But within the storm's embrace we find A cleansing force for heart and mind
Umbrellas turn inside out in the gale Leaves dance wildly, ships lower their sail Lightning paints the heavens with fleeting light Casting shadows that ignite the night We stand soaked, yet spirits unbroken Every raindrop a whispered token
When the heavens pour out all their sorrow We search for the promise of tomorrow Puddles reflect the world turned upside down But smiles emerge where once were frowns For after the storm, the air is clear And rainbows in the sky appear
So let the sky fall like a ton of buckets We'll laugh and splash in our soaked jackets Embracing the chaos, the wild, the free Finding courage in adversity When the clouds part and the sun ascends We'll cherish the journey and its mends
It's TRYING SOOOOO HARD. STOP TRYING SO HARD!