Whatever you say to Apple's (AAPL) and Microsoft's (MSFT) personal assistants, Siri and Cortana, may not be as private as you think.

Both companies store data input into their voice assistants, including voice commands, location information, and, well, basically everything you say. Apple explains its recording and storage policy in its privacy policy for iOS 8.1. The policy, which is easy enough to find if you're looking for it but not something anyone would stumble across otherwise, spells out every way your interactions with your phone are used by Apple, and bolds the following passage so it stands out:

By using Siri or Dictation, you agree and consent to Apple's and its subsidiaries' and agents' transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and dictation functionality in other Apple products and services.

Microsoft isn't quite as explicit in spelling out how it records data input into Cortana in its Windows Phone 8.1 privacy policy, but it does note repeatedly that the company stores your data.

Cortana sends speech data to Microsoft to build personalized speech models and improve speech recognition for Cortana and other Microsoft products and services. Speech data sent to Microsoft includes voice recordings, as well as associated performance data ... Cortana also collects and uses other information, such as the names of your contacts, how often you call them, the titles of your calendar events, and words you've added to the dictionary. 

Both companies also make it fairly easy to shut off these features, but doing so makes the personal assistants more or less useless.

Why are Apple and Microsoft doing this?
Apple and Microsoft spell out that they record and store data input into their voice assistants to make the experience of using them better. Both do this on a personal level -- learning what you mean when you voice certain commands -- and on a broader level, across all users of the app.

It's not as scary as it seems
Apple addressed consumer fears about its data handling in an April 2013 Wired article. In that story, Apple spokeswoman Trudy Muller told the magazine that the company does store data for two years, but that it becomes anonymous after six months.

"Once the voice recording is six months old, Apple 'disassociates' your user number from the clip, deleting the number from the voice file," she said in a call to the magazine. "But it keeps these disassociated files for up to 18 more months for testing and product improvement purposes."

Apple also makes clear in another privacy policy section on its website that it needs access to data to improve your experience, while emphasizing that the data is encrypted:

The longer you use Siri and Dictation, the better they understand you and the better they work. To help them recognize your pronunciation and provide better responses, certain User Data such as your name, contacts, and songs in your music library is sent to Apple servers using encrypted protocols. That said, Siri and Dictation do not associate this information with your Apple ID, but rather with your device through a random identifier.

That random identifier can be turned off and on at any time, "effectively restarting your relationship with Siri and Dictation." Turning Siri and Dictation off deletes the user data associated with your Siri identifier, and "the learning process will start all over again."

Microsoft does not say how long it stores data from customers who allow it to do so, but the company does offer detailed instructions for turning Cortana's data storage off on its "Cortana and my privacy" page. It's possible to exercise a fairly strong level of control over what information Microsoft receives. You could, for example, allow Cortana access to your location but not to dictated emails.

It's also possible to delete your Cortana data saved both on your phone and in the cloud, and Microsoft provides instructions on how to do that here.

Apple and Microsoft need to do this (sort of)
It's reasonable for Apple and Microsoft to use data input into Siri and Cortana to improve their voice assistants. But consumers should understand that this is happening and consider that digital privacy is something of an illusion -- at least if you don't take steps to manage your digital profile.

For example, the following search is now sitting on my iPhone and stored on Apple's servers, associated with me for the next six months.

[Siri screenshot. Source: Author.]

That would be bad if I were planning to commit a crime, but it's no different from browser history or any other data that can be extracted from a computer, a phone, or even the electronic tags used to pay tolls.

Be careful, but be reasonable
Yes, it's a little disturbing to know that Apple and Microsoft have records of your requests to find stores selling hemorrhoid cream or your late-night queries for White Castle locations, but it's mostly harmless. The info isn't being used for anything other than improving your experience with Siri or Cortana.

While there is more public info on Siri than on Cortana, both companies have gone to great lengths to explain that any data storage is meant not to embarrass customers, but to make the voice assistants more responsive. Knowing that this data is kept should make you a little more cautious, in the same way the Sony hacking incident made people realize that deleted email may not actually be gone.

So, do be a bit careful with what you confess to Siri or Cortana, but don't let privacy concerns stop you from using either one.