Slack trains machine learning models on user messages, files, and other content without explicit permission. Training is opt-out, meaning your private data will be used by default. Making matters worse, you'll have to ask your organization's Slack admin (HR, IT, etc.) to email the company to ask it to stop. (You can't do it yourself.) Welcome to the dark side of the new AI training data gold rush.
Corey Quinn, CEO of DuckBill Group, spotted the policy in a blurb from Slack's Privacy Principles and posted about it on X (via PCMag). The section reads, "To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content and files) submitted to Slack as well as Other Information (including usage information) as defined in our Privacy Policy and in your customer agreement."
The opt-out process requires you to do all the work of protecting your own data. According to the privacy notice, "To opt out, please have your Org or Workspace Owners or Primary Owner contact our Customer Experience team at feedback@slack.com with your Workspace/Org URL and the subject line 'Slack Global model opt-out request.' We will process your request and respond once the opt out has been completed."
I'm sorry Slack, you're doing WHAT with user DMs, messages, files, etc? I'm positive I'm not reading this correctly. pic.twitter.com/6ORZNS2RxC
— Corey Quinn (@QuinnyPig) May 16, 2024
The company responded to Quinn's post on X: "To clarify, Slack has platform-level machine-learning models for things like channel and emoji recommendations and search results. And yes, customers can exclude their data from helping train those (non-generative) ML models."
How long ago the Salesforce-owned company slipped that tidbit into its terms is unclear. It's misleading at best to say customers can opt out when "customers" doesn't include the employees working within an organization. Those workers have to ask whoever manages Slack access at their company to do it for them, and hope they follow through.
Inconsistencies in Slack's privacy policies add to the confusion. One section states, "When developing AI/ML models or otherwise analyzing Customer Data, Slack can't access the underlying content. We have various technical measures preventing this from occurring." However, the machine-learning model training policy appears to contradict this statement, leaving plenty of room for confusion.
Additionally, Slack's webpage marketing its premium generative AI tools reads, "Work without worry. Your data is your data. We don't use it to train Slack AI. Everything runs on Slack's secure infrastructure, meeting the same compliance standards as Slack itself."
In this case, the company is referring to its premium generative AI tools, which are separate from the machine learning models it trains without explicit permission. However, as PCMag notes, implying that all of your data is safe from AI training is, at best, a highly misleading claim when the company apparently gets to pick and choose which AI models that statement covers.
Engadget attempted to contact Slack through multiple channels, but did not receive a response at the time of publication. We’ll update this story if we hear back.