After Nine blamed an ‘automation’ error in Photoshop for producing an edited image of Georgie Purcell, I set out to find out what the software would do to other politicians.
Your hypothesis makes no sense.
People generating porn would make no change to its training data set.
You wouldn’t feed the images people generate and save back into the system to improve it?
This actually doesn’t work to improve the model, generally. It’s not new information for it.
Yup. But logically they'd have bots trawling for new posts, and they'd end up consuming social media posts that contain their own generated images.
Also, they would absolutely feed successful posts back into the system. You'd be stupid not to refine the model on the generations that worked.
Not after the initial training, no.
That would make it less effective, because instead of being trained on known real things it’s being further reinforced on its own hallucinations.
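For what it's worth, here's a minimal sketch of what that last point describes: a toy "model" (just a Gaussian fit) that is refit only on its own generated samples each generation, with no real data mixed back in. The distribution, sample size, and generation count are arbitrary choices for illustration, not anything from the thread; the fitted spread drifts and narrows over generations instead of improving.

```python
import numpy as np

# Toy illustration of a model trained on its own output: each generation we
# draw samples from the current fit (model output, not real data) and refit
# on those samples alone. The spread tends to collapse over generations,
# rather than the fit getting better.

rng = np.random.default_rng(0)

real_mu, real_sigma = 0.0, 1.0   # the "real" data the first model was fit on
mu, sigma = real_mu, real_sigma  # generation-0 model
n_samples = 50                   # synthetic "posts" scraped per generation

for generation in range(1, 301):
    fake_data = rng.normal(mu, sigma, n_samples)   # generated, not real
    mu, sigma = fake_data.mean(), fake_data.std()  # refit only on that output
    if generation % 50 == 0:
        print(f"gen {generation:3d}: mean={mu:+.3f}  std={sigma:.3f} "
              f"(real std was {real_sigma})")
```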