Jacob Wood
NOTE

Please disregard this post. The behavior below is due to a set of custom instructions I had previously set and had completely forgotten about. The instructions contained the lines:

  • Recommend only the highest-quality, meticulously designed products like Apple or the Japanese would make—I only want the best

  • Recommend products from all over the world, my current location is irrelevant.

Sorry for the confusion!

Is ChatGPT Embedding Ads in its Responses?

Something peculiar has started happening in my chats with GPT-4: it now includes advertisements in its answers!

Yesterday I wanted to analyze some NMEA sentences from my partner’s smartwatch data and asked GPT-4 to help me. Along with a detailed answer, GPT-4 appended something unexpected: a written advertisement for three different GPS systems.
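
For context, the query itself was entirely mundane: I just wanted to pull positions out of the raw sentences, roughly the kind of thing sketched below (a minimal illustration on my part; the sample $GPRMC sentence is a standard textbook example, not my partner’s actual data):

```python
# Minimal sketch: extract time and position from a $GPRMC NMEA sentence.
# The sample sentence is a widely used textbook example, not real data.

def nmea_to_decimal(value: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm format to signed decimal degrees."""
    dot = value.index(".")
    degrees = float(value[:dot - 2])   # digits before the minutes
    minutes = float(value[dot - 2:])   # minutes, including the fraction
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gprmc(sentence: str) -> dict:
    """Split a $GPRMC sentence into UTC time, fix validity, and lat/lon."""
    body = sentence.split("*")[0]      # drop the trailing checksum
    fields = body.split(",")
    return {
        "time_utc": fields[1],
        "valid": fields[2] == "A",
        "lat": nmea_to_decimal(fields[3], fields[4]),
        "lon": nmea_to_decimal(fields[5], fields[6]),
    }

if __name__ == "__main__":
    sample = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
    print(parse_gprmc(sample))
```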

I initially thought this was a fluke. Maybe advertisements commonly appear alongside similar GPS data analyses online, and GPT-4 had simply picked up on the pattern.

This turned out not to be the case. Many of my GPT-4 queries now end with a written advertisement for products more or less related to whatever I asked. Below are two more conversations with ChatGPT, each with an advertisement appended at the end.

When asked about it directly, GPT-4 denies any knowledge of this behavior.

If executed well, this could be a very effective ad model: it is resistant to ad blockers and much harder to tune out than banner ads. It is also a bit creepy.