Data retention policy
Knowi customers create and remain in control of their data, including data about their users, user activities, and reports. When you remove users from your Knowi instance, their data is removed from Knowi's systems and databases within 30 days, and it is purged from Knowi's cache within the same 30-day window.
If you wish to delete a Knowi user's account data, please contact your Knowi Administrator or internal compliance decision-maker for assistance. At the request of our customers, our data engineering team can permanently anonymize the data. Knowi Administrators may either self-serve directly from Knowi or Contact Us to request assistance.
Data archiving and removal policy
Knowi retains an active user's information until the user becomes inactive, at which point the data is archived for 180 days before it is removed. At the request of our customers, the removal process can be expedited and completed within 7 days of receiving the request.
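As a worked illustration of the timelines above, the sketch below computes the standard and expedited removal dates. This is not Knowi code; the function and constant names are hypothetical, chosen only to mirror the 180-day archive window and the 7-day expedited window stated in the policy.

```python
from datetime import date, timedelta

ARCHIVE_DAYS = 180    # archive window after a user becomes inactive (per policy)
EXPEDITED_DAYS = 7    # expedited removal window after a customer request (per policy)

def standard_removal_date(inactive_on: date) -> date:
    """Date archived data is removed under the standard 180-day policy."""
    return inactive_on + timedelta(days=ARCHIVE_DAYS)

def expedited_removal_deadline(requested_on: date) -> date:
    """Latest removal date when a customer requests expedited removal."""
    return requested_on + timedelta(days=EXPEDITED_DAYS)

# Example: a user inactive since 2024-01-01
print(standard_removal_date(date(2024, 1, 1)))       # 2024-06-29
print(expedited_removal_deadline(date(2024, 1, 1)))  # 2024-01-08
```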
Data storage policy
Knowi is committed to implementing and maintaining reasonable and appropriate technical, physical, and administrative safeguards to protect your personal data. However, no company, including Knowi, can guarantee the absolute security of Internet communications. For more information, please see our Privacy page or contact us with questions.
Data hosting details
cloud-hosted
App/service has sub-processors
no
App/service uses large language models (LLM)
yes
LLM model(s) used
Knowi supports multiple LLM providers via API integration, configured by each customer. Supported providers include OpenAI (GPT-4, GPT-4o, GPT-3.5), Anthropic (Claude), Azure OpenAI, and other API-compatible models.
LLM retention settings
LLM-side retention is governed by the policy of the provider each customer chooses and by the terms of that customer's own agreement with that provider (e.g., the OpenAI API's standard 30-day abuse-monitoring retention, or Zero Data Retention).
LLM data tenancy policy
Knowi does not operate its own LLM. Customers bring their own LLM provider credentials (e.g., their own OpenAI, Anthropic, or Azure OpenAI API key) and connect them to Knowi.
LLM data residency policy
Because customers configure their own LLM provider and credentials, data residency for LLM inference is determined by the provider and region the customer selects (e.g., Azure OpenAI in a specific region, OpenAI US, etc.).