Migrated terabytes of data from Amazon S3 and MongoDB into a SQL-queryable format in Amazon Redshift, giving analysts a single place to gather and query data. The architecture ingests real-time source data into S3 using Spark Streaming. When a file lands in S3, a message is published to an AWS SQS queue, which triggers an event that parses and cleans the file using Python. The parsed data is then loaded into Snowflake and Redshift, and a notification email is sent to the client via Amazon SES.
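A minimal sketch of the ingest step is below. The original does not name the real-time source, so a Kafka topic is assumed here; the broker address, topic name, and bucket paths are hypothetical placeholders.

```python
# Ingest sketch: stream real-time events into S3 with Spark Structured Streaming.
# The Kafka source, broker, topic, and S3 bucket are assumptions for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-ingest").getOrCreate()

# Read the raw event stream (source assumed to be Kafka).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Land each payload in S3 as a text file; every completed file later fires
# the S3 -> SQS notification described above.
query = (
    raw.selectExpr("CAST(value AS STRING) AS value")
    .writeStream
    .format("text")
    .option("path", "s3a://example-landing-bucket/raw/")
    .option("checkpointLocation", "s3a://example-landing-bucket/checkpoints/")
    .start()
)
query.awaitTermination()
```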
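The parse/clean step could look like the sketch below, assuming the bucket publishes ObjectCreated notifications to SQS. The queue URL, cleaning rule, and staging prefix are hypothetical; the boto3 calls (receive_message, get_object, put_object, delete_message) are standard.

```python
# Event-driven parse/clean sketch: poll SQS for S3 object-created events,
# clean each file with Python, and stage the cleaned copy back in S3.
import json
import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/landing-events"  # placeholder


def clean_record(line: str) -> str:
    """Hypothetical cleaning rule: strip whitespace and drop empty fields."""
    return ",".join(field for field in line.strip().split(",") if field)


def poll_once() -> None:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for msg in resp.get("Messages", []):
        # S3 event notifications arrive as JSON in the SQS message body.
        body = json.loads(msg["Body"])
        for record in body.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            obj = s3.get_object(Bucket=bucket, Key=key)
            lines = obj["Body"].read().decode("utf-8").splitlines()
            cleaned = "\n".join(clean_record(line) for line in lines)
            # Write the cleaned copy to a staging prefix for the load step.
            s3.put_object(
                Bucket=bucket, Key=f"clean/{key}", Body=cleaned.encode("utf-8")
            )
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```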
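Finally, a sketch of the load-and-notify step, assuming the Redshift Data API and an IAM role with read access to the staging prefix. The cluster name, database, table, role ARN, and email addresses are placeholders, and the parallel Snowflake load (typically a COPY INTO via the Snowflake connector) is omitted for brevity.

```python
# Load-and-notify sketch: COPY cleaned files from S3 into Redshift, then
# email the client via SES. All identifiers below are placeholders.
import boto3

redshift = boto3.client("redshift-data")
ses = boto3.client("ses")

COPY_SQL = """
COPY analytics.events
FROM 's3://example-landing-bucket/clean/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS CSV;
"""


def load_and_notify() -> None:
    # Kick off the COPY from the cleaned S3 prefix into Redshift.
    redshift.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="analytics",
        DbUser="loader",
        Sql=COPY_SQL,
    )
    # Tell the client fresh data is available, as in the pipeline's SES step.
    ses.send_email(
        Source="pipeline@example.com",
        Destination={"ToAddresses": ["client@example.com"]},
        Message={
            "Subject": {"Data": "Data load complete"},
            "Body": {"Text": {"Data": "Cleaned data has been loaded into Redshift."}},
        },
    )
```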