It was GPT-4 I was using. It could be that you wrote it as one instruction and your intentions were very clear from the beginning, while I explained it across multiple changes and clarifications when I noticed it wasn't giving me quite what I wanted.
Part of it is that I was intentionally being very human in my instructions, leaving them open to interpretation and then clarifying or adding things as I brainstormed. It's a messy way of doing it, but AI needs to be able to handle messy instructions in order to be considered on par with people.
Edit: turns out it wasn't GPT-4 I was using; I was using the free chat on OpenAI's website. I was not aware that they were different.