It would be on them if the AI generator were forthcoming about how it "created" the image. If a company like MS advertises a program that creates new content, but that program has a known tendency to output content from its training set, then they bear responsibility, especially if the program is being run remotely on their servers.