The client, a SaaS platform that works with US healthcare institutes to deliver an improved elderly care experience, was looking for a technology partner to help them set up a continuous delivery pipeline that fully complies with HIPAA security guidelines. They needed a platform that could handle high volumes of data and ensure its safe processing.
Cloud and DevOps
Established in 2007, the customer is one of the leading SaaS platforms that enable US healthcare institutes to deliver an improved elderly care experience while adding to their business growth.
The software is currently used by over 12,000 institutes every day and aims to establish a more collaborative, data-backed approach to addressing the challenges faced by the elderly across the country.
The patient data on the platform is collected from medical institutes across North America. As the company deals directly with sensitive protected health information (PHI) of patients, its cloud infrastructure must comply with the Health Insurance Portability and Accountability Act (HIPAA) and all the latest security guidelines defined under it.
The client was looking for a technology partner that could help them set up a continuous delivery pipeline that fully complies with HIPAA security guidelines. They partnered with Velotio considering our proven expertise in DevOps services as well as building HIPAA-compliant architectures.
Velotio’s Cloud team is unparalleled in their vast knowledge of the AWS Cloud ecosystem. Their resources have accelerated delivery and brought immense value to the team.
Velotio’s team of experienced Solution Architects designed and implemented a fault-tolerant architecture that fully complied with all HIPAA requirements and ensured the safe processing of patient data on the analytics platform.
The DevOps experts helped build an automated delivery pipeline with clear segregation-of-duties policies. A discovery mechanism was built to configure the overall stack and automate DNS management, secrets management, and related tasks. The pipeline also integrated with the customer's release process, making it faster for the development team to iterate and push frequent releases, while the operations team took responsibility for final validation and updates to the production environment.
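The segregation of duties described above can be sketched as a simple promotion policy: developers may push builds through every stage up to pre-production, while only the operations role can promote to production. The role and stage names below are illustrative, not taken from the client's actual pipeline.

```python
# Sketch of segregation of duties in a delivery pipeline.
# Stage and role names are hypothetical placeholders.
STAGES = ["build", "test", "staging", "pre-production", "production"]

# Developers can promote to every stage except production;
# operations can promote to all stages, including production.
ALLOWED = {
    "developer": set(STAGES[:-1]),
    "operations": set(STAGES),
}

def can_promote(role: str, stage: str) -> bool:
    """Return True if the given role may promote a build to the given stage."""
    return stage in ALLOWED.get(role, set())

print(can_promote("developer", "production"))   # developers cannot ship to prod
print(can_promote("operations", "production"))  # operations can
```

In a real pipeline this policy would be enforced by the CI/CD tool's role-based access controls rather than application code, but the shape of the rule is the same.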
The team isolated all the services inside the network but ensured a secure connection with the customer's production data center. This facilitated the data loading and ingestion, and enabled single sign-on with other web application services used by the customer.
The platform can now easily handle high volumes of data and scale to meet the demands of a Hadoop-based ingestion service, such as higher processing speeds and low-latency data replication.
The team combined multiple solutions to achieve this:
Chef was combined with AWS tools to implement complete automation, and the team's DevOps expertise went into building a powerful continuous delivery pipeline that is now used by both the Development and Operations teams.
After building the continuous delivery pipeline, Velotio's team worked with the client's in-house team to automate bootstrapping and resource provisioning by combining Chef-based provisioning with AWS CloudFormation. This resulted in a fully automated infrastructure that is managed as code and can be used to spin up additional sandbox, staging, and pre-production environments. It could also be used to rebuild the main production environment after a disaster, or whenever a move to another AWS region is needed.
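To illustrate the infrastructure-as-code approach, here is a minimal sketch of a CloudFormation template, built and serialized in Python, that parameterizes the target environment and bootstraps an instance with Chef from its user data. The AMI ID, instance type, and run list are hypothetical placeholders, not the client's actual configuration.

```python
import json

# Minimal infrastructure-as-code sketch: a CloudFormation template with an
# Environment parameter and an EC2 instance bootstrapped via chef-client.
# All concrete values (AMI, instance type, run list) are illustrative.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Illustrative environment template (sandbox/staging/pre-production)",
    "Parameters": {
        "Environment": {
            "Type": "String",
            "AllowedValues": ["sandbox", "staging", "pre-production", "production"],
        }
    },
    "Resources": {
        "AppInstance": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-00000000",  # placeholder AMI
                "InstanceType": "t3.medium",
                # Bootstrap with Chef on first boot (hypothetical run list)
                "UserData": {
                    "Fn::Base64": "#!/bin/bash\nchef-client --once --runlist 'role[app]'\n"
                },
            },
        }
    },
}

print(json.dumps(template, indent=2))
```

Because the whole environment is expressed as a template like this, rebuilding it in a fresh account or region is a matter of re-running the same stack with a different parameter value.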
Implemented self-healing, fault-tolerant services such as site-to-site and host-to-host VPNs and NAT gateways across multiple Availability Zones. Also set up an exclusive, fully managed DNS that supports reverse host lookup using Route 53. This is shared across VPNs to ensure easy access to integrated services.
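Reverse host lookup of the kind Route 53 supports relies on `in-addr.arpa` zones derived from the network's address range. As a small illustration, assuming a hypothetical /24 VPC subnet, the reverse zone name can be derived like this:

```python
import ipaddress

def reverse_zone_name(cidr: str) -> str:
    """Derive the in-addr.arpa zone name for a /24 network, as used by a
    private hosted zone that serves reverse (PTR) lookups."""
    net = ipaddress.ip_network(cidr)
    o1, o2, o3, _ = str(net.network_address).split(".")
    # Octets are reversed in the zone name
    return f"{o3}.{o2}.{o1}.in-addr.arpa"

# Hypothetical VPC subnet, not the client's actual address range
print(reverse_zone_name("10.20.30.0/24"))  # -> 30.20.10.in-addr.arpa
```

In Route 53 this zone would hold PTR records mapping instance IPs back to hostnames, which is what allows integrated services across the shared VPNs to resolve hosts in both directions.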
Implemented one-way connectivity to shared resources outside the network to deploy packages and cookbooks easily from the production network.
Ensured that the discovery mechanism can classify parameters to handle application configuration across multiple environments.
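A simple way to picture such parameter classification is a layered configuration: a common layer shared by all environments, with environment-specific values overriding it. The parameter names and values below are illustrative only.

```python
# Sketch of a discovery mechanism classifying configuration parameters
# into a common layer and environment-specific overrides.
# All keys and values here are hypothetical.
COMMON = {
    "log_level": "info",
    "dns_zone": "internal.example.com",
}

PER_ENV = {
    "staging": {"db_host": "db.staging.internal", "replicas": 1},
    "production": {"db_host": "db.prod.internal", "replicas": 3},
}

def resolve(env: str) -> dict:
    """Merge the common layer with an environment's overrides;
    environment-specific values win on conflict."""
    return {**COMMON, **PER_ENV[env]}

print(resolve("production"))
```

This keeps one source of truth for shared settings while letting each environment (sandbox, staging, production) diverge only where it must.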
Ensured appropriate in-transit encryption across all devices to provide reliable and secure delivery of SSL keystores, keys, and other secrets to all nodes.
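Secure delivery of keystores typically pairs an encrypted channel with an integrity check on the delivered artifact. The sketch below, using only the Python standard library, shows both halves in miniature: a strict TLS client context for the transport, and a SHA-256 checksum to verify the keystore on arrival. The payload is a dummy placeholder.

```python
import hashlib
import ssl

def checksum(data: bytes) -> str:
    """SHA-256 digest used to verify a delivered keystore's integrity."""
    return hashlib.sha256(data).hexdigest()

# Placeholder payload standing in for a real SSL keystore
keystore = b"dummy keystore bytes"
expected = checksum(keystore)

# A strict TLS client context for the delivery channel: by default it
# requires a valid certificate and checks the server's hostname.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate validation enforced
print(checksum(keystore) == expected)        # integrity verified on receipt
```

In practice the distribution itself would be handled by the configuration-management tooling (e.g., encrypted data delivered during a Chef run), with checks like these guarding the endpoints.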