
Improvability Data Security Overview


June 2024

Learn how Improvability AI and our data processors keep your data secure.

Data Security

Introduction
Privacy and security are top priorities at Improvability AI. Clients need to know that their data remains safe and secure when it is stored and processed by another company's services. This document walks through the components of the Improvability AI platform and explains how each handles security and privacy.


Secure Sockets Layer (SSL/TLS)
Communication over the modern internet between clients, servers, websites, and services is encrypted in transit using Transport Layer Security (TLS), the successor to the Secure Sockets Layer (SSL) protocol. This makes the data unreadable to eavesdroppers and greatly reduces the risk of man-in-the-middle attacks. As a result, this document is concerned primarily with data at rest rather than data in transit.
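To make the point concrete, here is a minimal sketch in Python's standard library showing what any well-behaved HTTPS client does before sending data: it builds a TLS context that requires a valid server certificate and a matching hostname. This is illustrative only, not part of the Improvability AI codebase.

```python
import ssl

def make_verified_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that verifies server certificates
    and hostnames, as HTTPS client libraries do by default."""
    context = ssl.create_default_context()
    # create_default_context() already enables certificate and hostname
    # verification; asserting them here makes the security posture explicit.
    assert context.verify_mode == ssl.CERT_REQUIRED
    assert context.check_hostname
    return context
```

Because these checks are on by default, a connection to a server presenting an invalid or mismatched certificate fails before any application data is sent.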


The Pieces of Improvability AI
There are several places in the Improvability AI software suite where privacy and security matter most. Much of the software is our own, but we also rely on third-party services for some functionality, chiefly a vector database and a Large Language Model (LLM).


The Vector Database on Pinecone
As of this writing, Improvability AI uses a vector database provided by Pinecone. Pinecone provides a hosted database that our software accesses via an API. In the context of a chatbot, a vector database is used to retrieve information relevant to a question asked by the end user. Pinecone holds this potentially proprietary information on its servers, so the security of those servers is extremely important.
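For readers unfamiliar with vector databases, the following toy sketch shows the core idea: documents are stored as numeric vectors, and a query vector retrieves the most similar ones. The sample data and function names are hypothetical; a hosted service like Pinecone performs the same operation at far larger scale, behind an API.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query, records, k=2):
    """Return the texts of the k stored records most similar to the
    query vector -- the essence of what a vector database does."""
    scored = sorted(records,
                    key=lambda r: cosine_similarity(query, r["vector"]),
                    reverse=True)
    return [r["text"] for r in scored[:k]]

# Tiny illustrative "index" of three documents (vectors are made up).
docs = [
    {"text": "refund policy",  "vector": [0.9, 0.1, 0.0]},
    {"text": "shipping times", "vector": [0.1, 0.9, 0.0]},
    {"text": "return window",  "vector": [0.8, 0.2, 0.1]},
]

print(top_k([1.0, 0.0, 0.0], docs, k=2))  # -> ['refund policy', 'return window']
```

Because the stored vectors and texts can encode proprietary information, the security of wherever this index lives is exactly what the rest of this section is about.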

On the Pinecone website, in their Security and Privacy section, they have this notice about compliance with various standards:


Routine third-party security reviews ensure we're in compliance with the latest industry standards. We are SOC2 Type II certified, HIPAA compliant, and GDPR-ready.


They also explain the measures they take to maintain the privacy and security of data in their system:

Built-in data safeguards mean your data is always protected. We keep your data stored in isolated containers and encrypted at rest and in transit. We never use your data other than servicing API calls.


On their Trust and Security Center page they make available documents and reports that go into great detail about their efforts in this area. All of this indicates that Pinecone takes security seriously and handles data stored in its system with care.

The Large Language Model on OpenAI
As of this writing, Improvability AI uses an LLM offered by a company called OpenAI. OpenAI provides access to an LLM called GPT-4, which our software accesses via an API. On the Enterprise privacy at OpenAI page, the company lays out its policies on this subject: it is CCPA, GDPR, SOC2, and SOC3 compliant, and it states that data submitted via the API is not used to further train any of its LLMs. Like Pinecone, OpenAI provides access to documents that detail its compliance and its efforts to ensure the security and privacy of customer data.
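The step where client data actually reaches the LLM provider is the assembly of the prompt: passages retrieved from the vector database are combined with the end user's question and sent as context. The sketch below illustrates that assembly step only; the function name, wording, and sample passages are hypothetical, and the actual API call to the provider is out of scope here.

```python
def build_prompt(question, retrieved_chunks):
    """Assemble the context sent to the LLM: passages retrieved from the
    vector database plus the end user's question. In a setup like the one
    described above, this string is the client data that crosses to the
    LLM provider."""
    context = "\n".join("- " + chunk for chunk in retrieved_chunks)
    return (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + context + "\n\n"
        "Question: " + question
    )

prompt = build_prompt(
    "What is the return window?",
    ["Items may be returned within 30 days.",
     "Refunds are issued to the original payment method."],
)
print(prompt)
```

Under the provider's stated policy, data sent this way services the API call but is not used for model training.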


Improvability AI hosted on AWS

The Improvability AI chatbot is hosted on and runs within the Amazon Web Services (AWS) cloud. As the largest and most pervasive cloud provider, AWS maintains extensive data privacy and security policies and standards, along with compliance with international law; its compliance programs are outlined on the AWS Compliance Programs page. AWS is SOC1, SOC2, and SOC3 compliant and adheres to dozens of additional international standards.


Conclusion

The preceding illustrates that each component of the Improvability AI software is safe, secure, and private. The only time data passes to a third party is when a question is asked of the Improvability AI chatbot: the chatbot retrieves relevant data from the vector database and passes it to the LLM as part of the context the model uses to formulate an appropriate answer. This is unavoidable in the current configuration, but it may be mitigated in the future by moving to a private instance of an LLM.

Begin automating your reporting and expert knowledge sharing.

Book a demo

© 2024 Improvability AI. All rights reserved.
