This page lists all articles in reverse chronological order. Use the Categories page to view the list of categories or Search to locate a particular article.
I recently received an email at work from a recruiter, but something seemed off. Upon closer inspection, I realized it was actually generated by an AI based on my resume. The email included job recommendations and was surprisingly honest about being sent by an AI representative named Jenny Johnson. It's a glimpse into the future of job searching in a World Without Apps.
I've been a GitHub Copilot user since the public beta; using GitHub Copilot in Visual Studio Code dramatically improves my efficiency when writing code, especially for Node.js or React apps. This post describes my first experience using Windows Copilot.
The recent incident in which a lawyer used ChatGPT to write a court brief, only to have it backfire because of fabricated content, highlights the limitations of the algorithm. ChatGPT is a next-word prediction algorithm, not a research engine; it generates content based on its training data. In this case, the algorithm invented references in the legal brief because it lacked real examples. To avoid such issues, next-word prediction algorithms can be combined with research algorithms to ensure the generated content is accurate and relevant. The incident underscores the importance of understanding the capabilities and limitations of standalone AI algorithms.
There's been a lot of press and anxiety lately about ChatGPT and other chat-based AI tools. In my view, the media isn't describing these tools very well and is deliberately fostering much of that anxiety, so I wrote this post to provide the details as I see them.
You see examples of AI everywhere nowadays, but what you don't see are many examples of multiple models wired together to accomplish something. What we have instead are little islands of AI, which minimizes their impact.