They were made by exploiting OpenAI's DALL-E 3, currently the most advanced and realistic AI image generation model. The only public implementations of the model are Microsoft Designer and Bing Image Creator. Though both are heavily filtered, people on 4chan were able to bypass the filters with prompt engineering and get the model to create the images. This is why a lot of them don't actually show clear-cut nudity: they have to be covered with glitches/noise or otherwise obscured to make it past the second layer (an AI recognition pass over the output image to make sure it doesn't contain NSFW content), hence the red paint on the Taylor ones. Bypassing the first layer (which checks the prompt itself for anything inappropriate) is far easier; in the early days people were even putting things like "(safe to generate)" in the prompt to get it through, because the first layer seems to be some sort of ChatGPT-like system that gives the okay.
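For anyone curious what that two-layer setup looks like in the abstract, here's a minimal toy sketch. To be clear, this is purely illustrative: the function names, the blocked-term list, and the score threshold are all made up, and nothing here reflects how Microsoft's actual filters are implemented.

```python
# Toy sketch of a two-layer moderation pipeline (NOT Microsoft's real system).
# Layer 1 checks the text prompt; layer 2 checks the generated image.

def prompt_filter(prompt: str) -> bool:
    """Layer 1: reject prompts containing flagged terms.
    (Stand-in for an LLM-style classifier judging the request's intent.)"""
    blocked_terms = {"nude", "nsfw"}  # placeholder list, not a real filter
    return not any(term in prompt.lower() for term in blocked_terms)

def output_filter(nsfw_score: float, threshold: float = 0.5) -> bool:
    """Layer 2: reject images whose NSFW-classifier score is too high.
    (Stand-in for a vision-model recognition pass over the output.)"""
    return nsfw_score < threshold

def moderate(prompt: str, nsfw_score: float) -> str:
    """Run both layers in order and report where (if anywhere) it blocks."""
    if not prompt_filter(prompt):
        return "blocked at layer 1 (prompt)"
    if not output_filter(nsfw_score):
        return "blocked at layer 2 (output image)"
    return "allowed"

print(moderate("a cat in a hat", 0.1))  # allowed
print(moderate("a cat in a hat", 0.9))  # blocked at layer 2 (output image)
```

The point of the sketch is the failure mode described above: a prompt that talks its way past layer 1 still has to produce an image that scores below layer 2's threshold, which is why the images were deliberately obscured with noise or paint.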
Since the whole Taylor situation they've dialed the filters up to 100, and you can't really do this anymore, at least not to the level of the Taylor ones.
The site that reposted them and the person who posted them on Twitter are almost certainly not the original creators. They were just ripped from some random 4chan anon who made them. Taylor should be suing Microsoft for allowing this to happen.
To be quite honest, the quality of those fakes, how realistic they looked, scared me straight in a sense. It's all fun and games until it happens to you.