The Cupertino, Calif.-based company pulled it from the App Store around 1 a.m. EST on Tuesday. The startup’s COO, Evgeny Tchebotarev, told TechCrunch that Apple doesn’t want children to unintentionally search for and find nude photos via the app:
The move came shortly after last night’s discussions with Apple related to an updated version of 500px for iOS, which was in the hands of an App Store reviewer. The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode in which this type of photo was hidden. To shut off safe search, 500px actually required its users to visit its desktop website and make an explicit change.
Tchebotarev clarified that 500px does not allow pornography, as it is against the service’s terms and conditions, and the nudes found within the community tend to be “artistic” in nature. The app also depends on users to flag inappropriate images, but the company is working on a feature that will auto-identify and tag nude images so they won’t appear in search.
500px told Apple yesterday that it would make any necessary changes to the app to rectify the situation, but Apple apparently couldn’t wait. The changes 500px promised Apple should be done now and are being submitted immediately, Tchebotarev said, as retold by TechCrunch.
@mkshft yep, it's about nude pictures. But isn't Facebook and Snapchat and Instagram is about that?
— Evgeny Tchebotarev (@tchebotarev) January 22, 2013
Update: An Apple spokesperson supplied The Next Web with the following statement about the removal:
The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.