No, it's the part where you misrepresent somebody else's creation as your own. If it's not alright for a person to do, then it's not alright to automate it. Stop trying to play dumb semantic games; it's not nearly as clever as you think it is.
It would be on them if the AI generator was forthcoming about how it "created" the image. If a company like MS advertises that they have a program that creates new content, but that program actually has a known tendency to output content from its training set, then they bear responsibility, especially if that program is being run remotely on their servers.