Coding with Cursor: A (Somewhat) Cautionary Tale

I recently went a bit meta and used three AI services to build TrackTagging.com, my AI-based web application for musicians. The AI services I used are: ChatPRD for creating my Product Requirements doc (PRD), v0 for prototyping, and Cursor for building the app. This article is the last of the series and is about my experience with Cursor.

As great as my experience was creating the PRD with ChatPRD and the prototype with v0, this is a somewhat cautionary tale for product managers building Production-ready apps. Working from my PRD and prototype, Cursor and I built my music app in a development environment in just two hours. It’s very cool.

But it took almost a week to deploy it to a publicly facing Production environment. In the end, I could really have used a human software engineer with intuition and hands-on experience, one who could suss out the unique technical nuances of OAuth token generation, presigned URL endpoints, Google Console integrations, etc.

You may recall in my first two articles that I am building TrackTagging.com to enable musicians to upload a song, have the AI service “listen” to it, then provide a written description of the music along with keywords and metadata. Simple enough, right? Not exactly.

The app requires integration with APIs from Sonoteller that do the actual assessment of the music. In turn, Sonoteller requires temporary storage on Google Drive. Therefore, my front-end needs to integrate with Google Drive and Sonoteller, and play nice within Vercel’s framework, which is where I am hosting the app.

In short, Cursor did an amazing job building the app that I designed, and it worked very hard to get all of the integrations right. But, as you’ll see, a lot of skilled precision is required when integrating multiple components securely.

The Good

v0 enabled me to export my prototype as a nicely organized directory with all the necessary files. Cursor had no problem importing these files and building the actual app based on my PRD. There were lots of stumbling blocks, which I get to in later sections, but the end product was exactly what I designed.

I can upload an audio file, watch a progress bar as Sonoteller does the analysis, then get awesome results that reflect the nature of my song. What’s more, I can edit any of the information I want to and then download it all as a .CSV for later use. Sweet! I link to a demo at the end of this article.
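For the curious, the CSV download essentially boils down to flattening the edited fields into rows. Here’s a minimal sketch of what that can look like; the field names (title, duration, description, keywords) are my own illustration, not the app’s actual schema:

```typescript
// Sketch: turn edited track metadata into a downloadable CSV string.
// Field names are illustrative, not the app's actual schema; quoting
// follows the common CSV convention (RFC 4180).

type TrackMetadata = {
  title: string;
  duration: string;      // e.g. "03:42", stored in a downstream-friendly format
  description: string;
  keywords: string[];
};

function csvEscape(value: string): string {
  // Quote any field containing commas, quotes, or newlines,
  // doubling any embedded quotes.
  return /[",\n]/.test(value) ? `"${value.replace(/"/g, '""')}"` : value;
}

function toCsv(tracks: TrackMetadata[]): string {
  const header = ["title", "duration", "description", "keywords"];
  const rows = tracks.map((t) =>
    [t.title, t.duration, t.description, t.keywords.join("; ")]
      .map(csvEscape)
      .join(",")
  );
  return [header.join(","), ...rows].join("\n");
}
```

The one fussy bit is escaping: any field containing a comma, quote, or newline has to be quoted, or a song title like “My Song, Live” breaks the row.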

Cursor brought great software engineering “experience” to the table. It managed field validation without a hitch. It cleverly saved the duration of a piece of music in a time format that was usable downstream. It automagically put fields and their labels in the “right” font and justification when I added more information to the screen. Expertly, it made tons of tiny UX updates in seconds. It was like sitting next to a pro front-end dev who knew the best way to make the UI attractive and beautifully usable.

Of course, Cursor didn’t mind making all of these fixes and that was gold. Here’s why: as much as I tried to get the UX/UI right in my v0 prototype, when I was working with actual data I noticed subtle issues in the workflow that caused friction.


Remember, the end user is a musician who hates wasting time adding metadata to their music. The flow has to be as painless as possible. As such, it was awesome having Cursor right there to quickly try different approaches and sand down as much friction as I could.

These subtle tweaks were the kind of fine tuning that an engineer would (rightfully) want to strangle me for after a while. But with Cursor I could let my creativity run wild. And that helped me elevate my app from great to sublime.

The Caution

Let’s be clear: Cursor is amazing. It created well-crafted code, it integrated with several external platforms, and it had limitless debugging strategies. I could not have written any of this code and I was a software engineer in a past life. But it’s not all magic.

Cursor is not always great with context. On more than one occasion it would suggest approaches we had already tested and seen fail. It would recommend complex code updates, then ask if we should implement them. “Um, okay. I guess.” <shrug> I had no frame of reference for assessing these new approaches. In all fairness, if asked, Cursor would dumb down particularly tech-y recommendations, but the nuances were lost on me.

And sometimes Cursor just couldn’t get it right. It had trouble with the layout on the initial screen and kept inserting a lot of whitespace for no reason. We could not seem to get rid of it through prompts. More alarming is that it lost track of our most current version: at some point Cursor made a fix that created an alternate version of the app. When the app stopped working and Cursor tried multiple fixes, it was actually updating a different version of the code than the version I was testing. That took a whole lot of time to figure out. 😦


Obviously, communication is key, especially with AI. We all know that our prompts have to be thoughtful. Here are a few Cursor-specific recommendations:

  • Don’t ask for too many updates at once (doh). It’s less risky to make small, logical updates, especially when we have our dependencies in the right order. Plus, a good Product Manager is always looking for those incremental improvements.
  • Once Cursor gets something right, ask it to memorize what it did so you can repeat those steps again later. For example, Sonoteller’s API calls are fussy. Once Cursor got one right I had it save everything we did to get there.
  • Ask if a prompt makes sense. Cursor recites back its understanding of the task in its own words and often maps out a plan that directly reflects its understanding.

The Ugly

With enough wrangling, Cursor and I got the app working on my local machine. But to get it working securely in Production I really needed a human software engineer who knew their stuff, because setting up all of the technical requirements to deploy the app (and there weren’t even that many) and debugging them took an exhausting amount of time.

Some examples: in order to set up the app, I needed multiple Node packages and the correct OAuth tokens. Okay, Cursor helped me grab them and set them up. What I didn’t know was that I needed to check for updates regularly, because tokens do go stale. An experienced engineer would have known what to keep an eye out for.
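In hindsight, the check an engineer would have baked in is simple. Here’s a hedged sketch; the token fields mirror a typical OAuth 2.0 response (`expires_in` seconds from issuance), though the exact shape varies by provider:

```typescript
// Sketch: decide whether an OAuth access token needs refreshing before
// calling an API. Field names mirror a typical OAuth 2.0 token response;
// the exact shape varies by provider.

type StoredToken = {
  accessToken: string;
  issuedAtMs: number;   // when we received the token (epoch milliseconds)
  expiresInSec: number; // lifetime reported by the provider
};

function needsRefresh(token: StoredToken, nowMs: number, skewSec = 60): boolean {
  // Refresh a little early (skewSec) so a request never goes out
  // with a token that expires mid-flight.
  const expiresAtMs = token.issuedAtMs + (token.expiresInSec - skewSec) * 1000;
  return nowMs >= expiresAtMs;
}
```

Nothing exotic, but it’s exactly the kind of housekeeping that goes stale silently if nobody thinks to add it.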

Sonoteller has a unique and quirky way of forming data calls. It took Cursor multiple experiments to get them right. An experienced engineer would have known how to cut to the chase rather than using the trial-and-error approach that Cursor used.

During the debugging process Cursor recommended how we should address CORS errors and React State issues. Aaaa! An experienced engineer would have sent me home! They would know how to dig into them and determine if they truly were the cause of the errors. 
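For context, fixing a CORS error on the server side usually comes down to returning the right Access-Control-* headers from the API route. A sketch of the idea, with an allow-list that is purely illustrative:

```typescript
// Sketch: build the Access-Control-* response headers that resolve a
// typical browser CORS error. The allowed-origin list is illustrative.

const ALLOWED_ORIGINS = ["https://tracktagging.com", "http://localhost:3000"];

function corsHeaders(requestOrigin: string): Record<string, string> {
  // Echo the origin back only if it is on the allow-list; browsers
  // reject wildcard origins when credentials are involved.
  const origin = ALLOWED_ORIGINS.includes(requestOrigin) ? requestOrigin : "";
  return {
    "Access-Control-Allow-Origin": origin,
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type, Authorization",
  };
}
```

Knowing that this is where to look, rather than chasing the error message around the front-end, is the intuition I was missing.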

Cursor would give an explanation of why it recommended a particular debugging strategy. I would follow the logic but, in the end, I had to take it on blind faith. Often Cursor would get it right. But occasionally it would send us down the wrong rabbit hole, one that took half a day to climb out of. Remember when it made an alternate version of the app without telling me?…

Results!

Did Cursor build a working app, at least on my local machine? It sure did! And it works amazingly well. Here is a video demo that goes through the whole track-tagging process. You can see how unified the design is, how streamlined the upload and analysis processes are, and how immediately usable the results are.

Does it work in a Production environment at the actual TrackTagging.com URL? Not yet. Cursor got me 90% of the way there and I have no doubt that we could get to 100%. But from a Product Manager’s perspective, I’d rather finalize the app with one of the awesome software engineers I know. It would be faster because a pro engineer can understand the unique circumstances we’re facing and take a deft approach to addressing the issues. And really, it would be a lot more fun to reach the final working results with another human.


C. Stuart Ridgway is a Principal Product Manager with 15+ years driving digital transformation for global organizations. He builds products that work across diverse markets, launching new initiatives (0-to-1) and scaling existing solutions (1-to-N). His products have served top media organizations globally and millions of people across 14 countries and 14 languages. His focus is delivering measurable business impact through data-driven product strategy and execution.

All content © C. Stuart Ridgway