I'm glad you made this point! In the eyes of many, the whole science of curating data has been relegated to back-end drudgery, and few even bother to acknowledge its importance. This could explain the occasional "butterfly effect" in AI — by which I mean responses drifting wildly off target because of improper data curation in the early stages. To some extent, AI practitioners themselves share part of the blame for exalting how-to-prompt-your-model skills as the path to success in tomorrow's world.