Case Studies
PXL - Open Source Social Network Platform
Project Summary
PXL is an open source social network platform for content creators. It enables users to create public or private spaces for any purpose, such as a space dedicated to a particular task. Users can also take advantage of social features such as building connections, posting projects that pique the interest of other users, adding team members, receiving notifications, and participating in projects. They can also manage their profiles and run a global search. A hosted version is available online, so anyone can try the free tool. PXL’s user interface is logical, and users can easily navigate its various elements.
Problem Statement
The client required a full-fledged backend application that could easily integrate with their prebuilt front-end application; they later also asked us to integrate the backend with the front end.
We had to design and build a social platform where users can showcase their inventions and gain exposure. Anyone can post a software project, categorize it, invite team members, and participate in other users’ projects.
Additionally, to support heavy content uploads, we needed a solution that could handle media file uploads smoothly while remaining affordable and effective.
We also had to create a real-time notification system that tracks network activity such as accepted requests, declined requests, and removals from a user’s network.
Our Solution
- We concentrated on completing each task as effectively as possible, backed by thorough testing, responsive design, and a focus on efficiency and performance.
- Based on the client’s requirements, we used an S3 bucket, RDS, EC2, and a Flask microservice for media files, and SES for emails.
- Amazon S3 was used for file hosting and data persistence.
- Amazon Relational Database Service (RDS) was used for database deployment, as it simplifies the creation, operation, management, and scaling of relational databases.
- Amazon EC2 was used for code deployment because it offers a simple web service interface for scalable application deployment.
- We sent emails using Amazon SES because it is a simple, cost-effective way to send and receive email using your own addresses and domains.
- Django with GraphQL was used for the backend, and Next.js was used for the front end. Django includes a built-in object-relational mapping (ORM) layer for working with application data across various relational databases (a brief sketch follows this list).
- GraphQL streamlines the backend API by providing a type-strict query language and a single endpoint where clients can query all the information they need and trigger mutations to send data to the backend.
- Next.js offers excellent server-side rendering and static site generation. We used the Flask microservice to handle heavy content uploads, since Flask’s file-upload handling gives the application the flexibility and efficiency to manage uploading and serving files.
- Using GitHub’s automated CI/CD pipeline, we set up triggers for code checkout and deployment.
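As a brief illustration of the Django-plus-GraphQL pattern described above, the following sketch exposes a project model through a single GraphQL endpoint using graphene-django. The `Project` model, its fields, and the `projects` app are hypothetical placeholders, not code from the PXL backend.

```python
# Minimal sketch: exposing a Django ORM model through one GraphQL endpoint
# with graphene-django. Model and field names are illustrative only.
import graphene
from graphene_django import DjangoObjectType

from projects.models import Project  # hypothetical Django app and model


class ProjectType(DjangoObjectType):
    class Meta:
        model = Project
        fields = ("id", "title", "category", "members")


class Query(graphene.ObjectType):
    all_projects = graphene.List(ProjectType)

    def resolve_all_projects(root, info):
        # Django's ORM builds the relational query behind this resolver.
        return Project.objects.all()


schema = graphene.Schema(query=Query)
```

A client would then send a query such as `{ allProjects { id title } }` to the single GraphQL endpoint, and mutations for sending data back follow the same schema-driven pattern.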
Technologies
Django-GraphQL, Next.js, PostgreSQL, AWS S3, EC2, SES and RDS
Success Metrics
- All deliverables were completed on time and exceeded expectations.
- Met all of the client’s expectations and received positive feedback.
- The client was constantly updated on the status of the project.
Streamlining data processing and efficiently analyzing data through a data warehouse solution
About
A non-profit healthcare insurance provider encountered difficulties managing the high volume of data on their on-premises system, which impeded their ability to analyze that data efficiently and make informed business decisions. To address these issues, they chose to migrate their data to Amazon Redshift.
Business Challenge
As their business grew, the insurance provider faced several challenges with their legacy system. Most importantly, their on-premises data warehouse, Oracle Exadata, required significant time and resources to administer, especially for large datasets. Additionally, the financial costs of building, maintaining, and growing a self-managed, on-premises data warehouse were very high.
To manage costs, keep ETL complexity low, and deliver acceptable performance, the customer had to constantly trade off which data to load into the data warehouse and which data to archive in storage.
Technical Challenge
The customer’s data pipeline followed a collect, store, process/analyze, and consume model, leveraging multiple AWS services. Their data lake was built on an Amazon S3 bucket, and the data stored in it could be queried using dbt, utilizing AWS Glue Data Catalog databases and crawlers. We recommended Amazon Redshift and Redshift Spectrum to build their data warehouse. After the data is mapped with Redshift Spectrum, Amazon Redshift loads it into Redshift tables. The data is then visualized and consumed by users through Amazon QuickSight.
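To make the Spectrum mapping step concrete, here is a rough sketch of registering the Glue Data Catalog database as an external schema and loading curated rows into a native Redshift table through the Redshift Data API. The cluster, IAM role, database, schema, and table names are placeholders, not the customer’s.

```python
# Illustrative sketch: map the S3 data lake through Redshift Spectrum and
# load rows into a native Redshift table. All identifiers are placeholders.
import boto3

client = boto3.client("redshift-data", region_name="us-east-1")

EXTERNAL_SCHEMA_SQL = """
CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum_lake
FROM DATA CATALOG
DATABASE 'claims_lake_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole';
"""

LOAD_SQL = """
INSERT INTO analytics.claims
SELECT claim_id, member_id, claim_amount, service_date
FROM spectrum_lake.raw_claims
WHERE service_date >= DATEADD(day, -1, CURRENT_DATE);
"""

for sql in (EXTERNAL_SCHEMA_SQL, LOAD_SQL):
    client.execute_statement(
        ClusterIdentifier="analytics-cluster",  # placeholder cluster name
        Database="analytics",
        DbUser="etl_user",
        Sql=sql,
    )
```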
AWS Services Adopted for this Solution
- Amazon S3, for data lakes
- AWS Glue, as a data catalog
- Amazon Redshift, as a data warehouse
- Amazon QuickSight, for data visualization
Data Processing Solution for Healthcare Organization
Amazon Redshift’s scalability and flexibility make it easy to manage expanding data volumes. With Redshift, customers can reduce the total cost of ownership (TCO) of their database environments. Redshift also provides centralized, secure data storage that automatically patches and backs up the data warehouse, retaining backups for a user-defined period. Replication and continuous backups enhance availability and improve data durability, and the service can automatically recover from component and node failures.
Amazon Redshift’s parallel processing and compression capabilities accelerate command execution, enabling it to operate on billions of rows simultaneously. Redshift Spectrum integrates seamlessly with other AWS services (such as Glue), making it easy to build end-to-end data pipelines. This ensures that data is always up-to-date, accurate, and secure.
“Before moving to Amazon Redshift, our engineering team was spending too much time managing our on-prem data warehouse. Now, we are saving so much time on data management, and our data analysis has improved significantly.”
- CIO, Healthcare Insurance Provider
Data Processing Results
With Amazon Redshift and additional AWS services, the customer gained a centralized and secure data storage solution, and a streamlined data analysis process. The scalability and flexibility of Redshift empowered the customer to handle expanding data volumes seamlessly.
Get started with Healthcare Data Processing
Take the first step toward improving your healthcare data processing and gain better data security, analysis, and scalability. Contact Cloudtech today.
Mizaru - Online Platform for People with Disabilities to Get Support Services
Project Summary
Creating a marketing website using ReactJS and AWS for the client to showcase what they do and how they do it.
Enhancing features in an existing web application where people with disabilities can request a communication facilitator or a support service provider, and providers can accept requests and receive payment.
Problem Statement
The client divided the project into several MVPs.
As part of MVP-1, the client wanted a marketing website that is fast, secure, and helps people understand what Mizaru is and how it can benefit them. They wanted a website that performs quickly, is protected from bots, and is cheap to maintain.
MVP-2 involved enhancing the client’s existing web application, which was previously very basic. They wanted features such as admin dashboard management and QR code-based check-in and check-out for providers delivering a service.
In MVP-3, they wanted us to create a mobile application offering the same functionality.
Our Solutions
1) We created a marketing website for users with ReactJS, which gave us a faster way to build and serve the application.
2) For deployment and maintenance we used AWS, which reduced our costs and maintenance effort.
3) For enhanced protection against bots, we implemented Google reCAPTCHA v3 (a verification sketch follows this list).
4) Once users have a clear understanding of the service, they move on to the web app or the mobile app.
5) Through the web app, customers (people with disabilities) can create a request based on their needs (e.g. a communication facilitator or a support service provider). Our application connects people with disabilities with service providers: each request is visible to multiple providers in the network, who can choose to accept or reject it.
6) We integrated a payment gateway for processing payments. Both customers and providers are notified of relevant events, and we created a dashboard for admins to track requests and generate reports as needed.
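Although the Mizaru stack is built on ExpressJS and ReactJS, the reCAPTCHA v3 flow mentioned in item 3 is language-agnostic: the frontend obtains a token and the backend verifies it against Google’s siteverify endpoint. The following Python sketch shows that server-side check; the secret key and score threshold are placeholders.

```python
# Sketch of server-side Google reCAPTCHA v3 verification.
# The secret key and threshold below are placeholders.
import requests

RECAPTCHA_VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"
RECAPTCHA_SECRET = "your-secret-key"  # kept in environment/config in practice


def is_human(token: str, min_score: float = 0.5) -> bool:
    """Return True when Google scores the request above the threshold."""
    resp = requests.post(
        RECAPTCHA_VERIFY_URL,
        data={"secret": RECAPTCHA_SECRET, "response": token},
        timeout=5,
    )
    result = resp.json()
    return result.get("success", False) and result.get("score", 0.0) >= min_score
```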
Technologies
ExpressJS, ReactJS, Redux, AWS, Git, HubSpot, Google reCAPTCHA v3
Success Metrics
- Created and delivered the marketing website within the given timeframe.
- Created a report generation feature for admins.
- Implemented QR code-based check-in and check-out for providers.
- Set up email reminders for customers and providers before each service.
Mid-Market Financial Services Organization Finds Success with Event-Driven Architecture
About
This financial services organization provides the financial planning, advice, and educational resources that investors need in a timely manner, including retirement planning, wealth preservation, brokerage services, and more. The organization serves customers worldwide to help them realize their financial goals.
Business Challenge
The data ingestion process of a well-known financial services organization was not designed to scale to handle peak and projected future volume levels. Failures in that process could directly impact their ability to provide accurate and timely analysis, reporting, and investment advice.
Were that to happen, the ensuing reduction in assets under management and future opportunity loss would easily total tens of millions of dollars. Additionally, the data came from multiple sources in multiple formats and the whole system was governed by FINRA compliance regulations, including data encryption.
Technical Challenge
The organization needed an event-driven data processing solution that securely ingests, prepares, analyzes, and scales during peak and projected volume levels. This solution needed to handle data from multiple data sources in multiple formats, be cost-effective, and meet their FINRA compliance regulations. This solution would also allow the organization to have access to timely analysis and reporting with a comprehensive visualization of the data.
AWS Services Adopted for this Solution
- Amazon EventBridge, for data requests
- AWS Lambda, for file validation
- AWS Glue, for ETL
- Amazon Athena, for queries
- Amazon QuickSight, for data visualization
- Amazon CloudWatch and Amazon SNS, for monitoring and alerts
Event-Driven Architecture Solution for Financial Services Firm
Cloudtech implemented an event-driven solution anchored by Amazon EventBridge, AWS Lambda, and AWS Glue. During non-working hours, Amazon EventBridge triggers Lambda to request new data and store it in Amazon S3. AWS Lambda also performs file validation before the data moves on to an AWS Glue Crawler and then an AWS Glue job for ETL into the AWS Glue Data Catalog, where it is queried using Amazon Athena and visualized using Amazon QuickSight. Monitoring and alerting are handled throughout the flow by Amazon CloudWatch and Amazon SNS. The result not only enhances scalability and accuracy, but also enables cost savings.
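The validation step at the front of this flow can be sketched as a small Lambda handler. The bucket name, expected columns, and the way the payload arrives in the event are assumptions for illustration, not details of the customer’s pipeline.

```python
# Hypothetical sketch of the EventBridge-triggered validation Lambda:
# check the structure of an incoming file and land it in S3 for the
# Glue crawler. Bucket, schema, and event shape are illustrative only.
import csv
import io

import boto3

s3 = boto3.client("s3")

RAW_BUCKET = "example-raw-data-bucket"                     # placeholder
EXPECTED_COLUMNS = {"account_id", "trade_date", "amount"}  # assumed schema


def handler(event, context):
    # In the real pipeline the file would be fetched from upstream data
    # providers; here it is assumed to arrive in the event detail.
    body = event["detail"]["payload"]
    key = event["detail"]["object_key"]

    reader = csv.DictReader(io.StringIO(body))
    if not EXPECTED_COLUMNS.issubset(reader.fieldnames or []):
        # Raising here surfaces the failure to CloudWatch/SNS alerting.
        raise ValueError(f"Unexpected columns in {key}: {reader.fieldnames}")

    s3.put_object(
        Bucket=RAW_BUCKET,
        Key=f"incoming/{key}",
        Body=body.encode("utf-8"),
        ServerSideEncryption="aws:kms",  # encryption at rest, in line with FINRA-driven requirements
    )
    return {"status": "validated", "key": key}
```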
“Cloudtech's event-driven architecture solution is the backbone of our timely and accurate investment advice that our clients rely on.”
- VP of Engineering, Financial Services
Event-Driven Architecture Results
Cloudtech provided the organization with an event-driven data processing solution that is secure, scalable, and able to handle multiple data sources and formats. The solution also delivered timely analysis, reporting, and a comprehensive view of the data, allowing the organization to provide accurate investment advice to its clients. In turn, it helped the organization avoid potential future opportunity losses that could have cost them tens of millions of dollars.
Increase Your Data Security
Learn how you can realize the benefits of an event-driven architecture for your financial services data, to improve security, analysis, and customer experience. Contact Cloudtech today.
Enklu - Redefining Augmented Reality
Executive Summary
Enklu aims to provide an Augmented Reality (AR) runtime for UWP, WebGL, Windows Standalone, Android, and iOS. It carves a niche in the market with a highly iterative product: changes to layout, assets, UI, or scripts give users instant feedback because new data is downloaded automatically, eliminating the need to rebuild. Enklu is truly cross-platform; not only does it compile flawlessly to multiple targets, it also allows experiences to be tailored to each platform. Enklu employs Unity along with a C# and Node.js backend to provide a web app that helps content creators build AR/VR experiences, and it uses React for its frontend.
Problem Statement
Most of the tech stack was deployed on Azure VMs. However, archaic deployment processes requiring a lot of manual input, coupled with poor infrastructure planning, had resulted in a high amount of downtime.
This problem was brought into sharp relief when their user base climbed tenfold, and it was further compounded by a lack of health checks and resource monitoring. Subpar patches had brought core maintenance and enhancement operations to a screeching halt.
Our Solutions
1) The first thing we proposed was to move the frontend build files to S3 to reduce the load on the server (see the sketch after this list). We then automated the build and deployment of Docker images using GitHub Actions and Terraform, and set up better resource checks using the built-in Azure triggers.
2) Next, we proposed rewriting parts of the code to better handle errors and setting up Node clusters behind a load balancer to reduce the load on the primary Unity servers. This also reduced downtime, since nodes could be safely brought down during low-traffic hours without affecting the user experience.
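A deployment-side sketch of the first step, syncing the static frontend build to S3, might look like the script below. The bucket name and build directory are placeholders; in the actual pipeline a step like this would run from the CI workflow rather than by hand.

```python
# Rough sketch: upload the static frontend build output to S3 so the
# application servers no longer serve those assets. Names are placeholders.
import mimetypes
from pathlib import Path

import boto3

s3 = boto3.client("s3")
BUCKET = "example-frontend-build"   # placeholder bucket
BUILD_DIR = Path("build")           # e.g. the frontend build output folder


def upload_build() -> None:
    for path in BUILD_DIR.rglob("*"):
        if path.is_file():
            content_type, _ = mimetypes.guess_type(path.name)
            s3.upload_file(
                str(path),
                BUCKET,
                path.relative_to(BUILD_DIR).as_posix(),
                ExtraArgs={"ContentType": content_type or "application/octet-stream"},
            )


if __name__ == "__main__":
    upload_build()
```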
Technologies
C#, Node.js, AWS (SQS, S3), Azure (VM and Load Balancer), Unity, .NET, Docker
Success Metrics
1. Reduced downtime
2. Better error alerts
3. Reduced first response time (FRT) for resource hiccups
Transforming an Elasticsearch Solution to OpenSearch Service for Streamlined Business Operations
About
A global IT staffing company that serves customers in every major vertical, from startups to Fortune 100 companies, was looking to modernize its business operations so it could more reliably make data-driven business decisions.
Business Challenge
The IT staffing company initially built a dual-purpose system that functioned as both a customer relationship management and enterprise resource planning tool. While custom-built for their own specifications, the existing system created some unique business challenges - specifically in the log analytics portion of the tool built on Elasticsearch.
The first challenge was price: between the Elastic licensing fees and the cost of labor for upkeep and maintenance, it was not a cost-effective solution. Next was reliability: the system was prone to frequent downtime and performance issues. On top of that, the business disruptions caused both internally and externally signaled that the customer was ready for a change.
Technical Challenge
Cloudtech analyzed their data infrastructure and determined that the current solution, a self-managed ELK stack, was suboptimal given its licensing fee costs. The solution also required eight engineers to maintain and troubleshoot.
In addition, data streams were vulnerable to disruption, and lack of shard availability and rebalancing issues caused frequent search throughput breakdowns. Lastly, upgrading the solution was tedious and time-consuming, taking up to a month in some cases.
Log Analytics Solutions
Based on the challenges, Cloudtech proposed a centralized logging platform using Amazon OpenSearch Service, AWS Fargate, Amazon Elastic Container Registry (ECR), and Amazon Elastic Kubernetes Service (EKS). This platform provided the customer with the scalability and data integrity required to store and analyze logs generated by their CRM/ERP tool. Given that cost and maintenance were important issues to address, Amazon OpenSearch Service was chosen for its speed, price, and security. Additionally, as a managed service, it alleviated the maintenance overhead associated with the previous solution.
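As an illustration of what centralizing logs on Amazon OpenSearch Service looks like from the application side, the sketch below indexes a log event into a daily index using the opensearch-py client. The domain endpoint, credentials, and index naming are placeholders; a production setup would typically authenticate with IAM/SigV4.

```python
# Illustrative sketch: ship an application log event into an Amazon
# OpenSearch Service domain. Endpoint, auth, and index names are placeholders.
from datetime import datetime, timezone

from opensearchpy import OpenSearch  # pip install opensearch-py

client = OpenSearch(
    hosts=[{"host": "search-example-logs.us-east-1.es.amazonaws.com", "port": 443}],
    http_auth=("log_writer", "example-password"),  # placeholder; prefer IAM/SigV4
    use_ssl=True,
    verify_certs=True,
)

log_event = {
    "@timestamp": datetime.now(timezone.utc).isoformat(),
    "service": "crm-api",
    "level": "ERROR",
    "message": "Failed to sync candidate record",
}

# Daily indices keep shards manageable and make retention policies simple.
index_name = f"app-logs-{datetime.now(timezone.utc):%Y.%m.%d}"
client.index(index=index_name, body=log_event)
```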
“The cost savings and downtime reductions we experienced were far above anything we expected, and have since significantly impacted our business in a positive way. The insights we get from reliable log analytics are driving our business forward.”
- CTO, IT Staffing Company
Log Analytics Solution Results
Cloudtech’s solution enabled the customer to centralize their logs, process them in real time, gain insights, and make data-driven decisions. Amazon OpenSearch Service made it easier to perform search operations, improved search efficiency, and provided scalable searching; the ability to scale up and down in minutes improves search throughput. In addition, OpenSearch Service allows the customer to schedule service software and Auto-Tune updates during off-peak hours, helping them better plan their deployments. Most importantly, the customer realized a 40% cost reduction and an 80% reduction in downtime.
Reduce Costs with Log Analytics Solution
Implement your own data processing and log analytics solution, and realize the business benefits of reduced downtime, optimized costs, and improved business decisions. Contact Cloudtech today.