The experiment is published in the article "Experimental evidence of massive-scale emotional contagion through social networks" in PNAS.
The authors state:
The experiment manipulated the extent to which people (N = 689,003)
were exposed to emotional expressions in their News Feed.
[...]
Two parallel experiments were conducted for positive and negative emotion: One in which exposure to friends’ positive emotional content in their News Feed was reduced, and one in which exposure to negative emotional content in their News Feed was reduced. In these conditions, when a person loaded their News Feed, posts that contained emotional content of the relevant emotional valence, each emotional post had between a 10% and 90% chance (based on their User ID) of being omitted from their News Feed for that specific viewing.
This clearly shows that the authors themselves claim to have manipulated the feeds of hundreds of thousands of users.
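The omission mechanism the paper describes (each emotional post dropped with a per-user probability between 10% and 90%, derived from the User ID) could be sketched as follows. This is only an illustrative guess: the paper does not say how the User ID was mapped to a probability, so the hash-based mapping and the `emotional` post flag below are assumptions.

```python
import hashlib
import random


def omission_probability(user_id: int) -> float:
    """Map a user ID deterministically to an omission chance in [0.10, 0.90].

    The paper only says the chance was "based on their User ID"; hashing
    the ID to a uniform fraction is one plausible way to do that.
    """
    digest = hashlib.sha256(str(user_id).encode()).digest()
    fraction = int.from_bytes(digest[:8], "big") / 2**64  # roughly uniform in [0, 1)
    return 0.10 + 0.80 * fraction


def filter_feed(user_id: int, posts: list, rng: random.Random) -> list:
    """Drop each emotional post with the user's omission probability.

    `posts` is assumed to be a list of dicts with an "emotional" flag;
    non-emotional posts are never omitted, matching the paper's description
    that only posts of the targeted emotional valence were affected.
    """
    p = omission_probability(user_id)
    return [post for post in posts
            if not post["emotional"] or rng.random() >= p]
```

Note that the probability is fixed per user while the coin flip happens on every feed load, which matches the paper's phrasing that a post could be omitted "for that specific viewing" yet still appear later.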
The authors explicitly thank the Facebook News Feed team in the acknowledgements, so we can safely assume that Facebook helped the researchers manipulate the feeds:
We thank the Facebook News Feed team, especially Daniel Schafer, for encouragement and support; the Facebook Core Data Science team, especially Cameron Marlow, Moira Burke, and Eytan Bakshy; plus Michael Macy and Mathew Aldridge for their feedback.
On the issue of informed consent, the authors write:
As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.
They claim that agreeing to Facebook's Data Use Policy when signing up constitutes informed consent; they neither explicitly requested permission beforehand nor informed the involved users afterwards. A blog post by David Gorski on Science-Based Medicine explores the ethical issues behind this and the question of whether this actually constitutes informed consent.