Data retention policy
We retain your personal data only for as long as necessary to fulfill the purposes set out in this Privacy Policy. If you would like more information about specific retention periods, please contact legal@intercom.com.
Data archiving and removal policy
Upon termination or expiry of the Agreement, Intercom will (at Customer's election) delete or return to Customer all Customer Data (including copies) in its possession or control as soon as reasonably practicable, and in any event within 30 days of termination or expiry of the Agreement. This requirement will not apply to the extent that Intercom is required by applicable law to retain some or all of the Customer Data, or to Customer Data archived on back-up systems, which Intercom will securely isolate and protect from any further processing, except to the extent required by applicable law.
Data storage policy
All Customer Data is stored in the USA and is backed up for disaster recovery.
App/service has sub-processors
Yes
Guidelines for sub-processors
App/service uses large language models (LLM)
Yes
LLM model(s) used
We use GPT-3.5 Turbo and GPT-4. Each is used at a different stage of data processing and to interpret the user query. We are experimenting with other models, such as GPT-4 Turbo, so this may change as newer models are released.
LLM retention settings
We don't use the data to train an LLM. Slack data is included in the prompt sent to the LLM as part of each request, so there is no data retention on the LLM side. The data we extract from Slack is stored in our internal database, which is subject to the Intercom data retention policy.
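As a rough illustration of the prompt-inclusion pattern described above (a minimal sketch, not Intercom's actual code; the function names, prompt wording, and model choice are assumptions), each request is self-contained: the Slack data travels inside the request body, and the model stores nothing between calls.

```python
# Minimal sketch of the prompt-inclusion pattern: Slack data exists only
# inside this one request, so there is nothing to retain on the LLM side.
# All names here are illustrative assumptions, not Intercom's implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer_query(user_query: str, slack_context: str) -> str:
    """Send one stateless request; the Slack data lives only in this prompt."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # per the policy above; GPT-4 is used at other stages
        messages=[
            {
                "role": "system",
                "content": "Answer the user's question using only the "
                           "Slack messages provided below.\n\n" + slack_context,
            },
            {"role": "user", "content": user_query},
        ],
    )
    return response.choices[0].message.content
```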
LLM data tenancy policy
We have a multi-tenant database in AWS in the US region. Each request to the LLM includes data for only one customer, and the LLM itself does not maintain any context between requests.
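The sketch below illustrates that tenancy model under stated assumptions (the table name, schema, and customer_id column are hypothetical): every prompt is built from rows scoped to a single customer, so one request never mixes tenants.

```python
# Illustrative sketch of per-request tenant isolation in a multi-tenant
# store (assumed schema and names; not Intercom's implementation).
import sqlite3

def build_context_for_customer(db_path: str, customer_id: str) -> str:
    """Read only this customer's rows from the shared (multi-tenant) table."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT message_text FROM slack_messages WHERE customer_id = ?",
            (customer_id,),  # parameterized filter enforces tenant scoping
        ).fetchall()
    finally:
        conn.close()
    # The returned string becomes part of exactly one stateless LLM request;
    # the model holds no context once that request completes.
    return "\n".join(text for (text,) in rows)
```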
LLM data residency policy
All Customer Data resides in our multi-tenant database in the AWS US region. Data is sent to the LLM only within individual requests, and the LLM itself does not retain any data or context.