Work Experience
Senior Software Engineer
GlowTouch•  December 2021 - Present
Scraped historic market price data and processed it with AWS Lambda and Step Functions:
- Data scraping: used Python to scrape historic market price data from the web.
- Data storage: stored the scraped data in Amazon S3, AWS's highly scalable and durable object storage service.
- Data processing: processed the scraped data with AWS Lambda, a serverless compute service that runs code without managing servers.
- Data orchestration: designed and implemented a Step Function to orchestrate the pipeline, invoking the processing Lambda and checking data quality before writing results to an Athena table.
- Data quality: built a data quality checker, integrated into the Step Function, to verify the accuracy and completeness of the data.
- Performance optimization: tuned the pipeline for performance and scalability so it could handle large data volumes efficiently.
- Collaboration: worked closely with data analysts and business stakeholders to understand their data needs and deliver solutions that met them.
This end-to-end pipeline work, with its emphasis on data quality and performance, is the core of my fit for Senior Data Engineer roles.
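The orchestration described above can be sketched as a minimal Amazon States Language definition: a processing Task, a quality-check Task, and a Choice state that gates writing to Athena. All state names, ARNs, and the `$.quality_passed` field are illustrative placeholders, not details from the actual project.

```python
import json

# Hypothetical state machine for the pipeline: process -> quality check -> gate.
# Every ARN and state name below is a made-up placeholder for illustration.
STATE_MACHINE = {
    "StartAt": "ProcessScrapedData",
    "States": {
        "ProcessScrapedData": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:process-prices",
            "Next": "CheckDataQuality",
        },
        "CheckDataQuality": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:quality-check",
            "Next": "QualityGate",
        },
        "QualityGate": {
            # Route on the quality checker's verdict before touching Athena.
            "Type": "Choice",
            "Choices": [
                {
                    "Variable": "$.quality_passed",
                    "BooleanEquals": True,
                    "Next": "StoreInAthena",
                }
            ],
            "Default": "FailPipeline",
        },
        "StoreInAthena": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:store-athena",
            "End": True,
        },
        "FailPipeline": {"Type": "Fail", "Error": "DataQualityError"},
    },
}

# A real deployment would pass this JSON to CreateStateMachine (e.g. via boto3).
definition = json.dumps(STATE_MACHINE)
```

The point of the Choice state is that bad data fails the execution loudly instead of silently landing in the Athena table.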
Senior Software Engineer
Solution Champs Technologies•  November 2019 - December 2021
DVR AI BOX: I worked on the DVR AI BOX project as part of a 14-member core development team. As a Deep Learning Engineer, my primary responsibilities were all AI components and core backend development, targeting the Nvidia Jetson Nano and Jetson Xavier NX platforms.
A significant achievement was building a Recording Engine for storing high-quality streams, with WebRTC support, automatic smart clip deletion, URL probing, and timeline indexing. The engine also exposed streams for live playback over WebRTC using the NVDEC and NVENC hardware accelerators, stored clip data in SQLite (a lightweight database well suited to Jetson), and used FFmpeg and GStreamer for streaming.
I also developed an Analytics Engine that read continuous high-quality streams through GStreamer with NVDEC and applied background subtraction to every frame, using clustering logic to draw bounding boxes around significant movement and export the pre-, post-, and current frames along with bounding-box metadata. I retrained a YOLO model from earlier projects and converted it to a tiny variant that is lightweight yet performant on Jetson boards; the exported frames were fed to YOLO Darknet to detect objects, and the detections were exported to the cloud. Events were then presented with images and clips, with images pushed to Amazon S3, which was useful for security alarm management boxes; when S3 usage grew too large, we archived the data to Amazon S3 Glacier.
Node-RED served as middleware for all API transactions: the UI called Node-RED, which in turn called the backend engine. MongoDB, SQLite, and S3 were used for storage, and the frontend was built with Angular 7 and ASP.NET. Overall, I built all the backend services and engines from scratch, communicated over the network with ONVIF devices, and sent email alerts to customers.
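The motion-detection step above (clustering foreground pixels from background subtraction into bounding boxes) can be sketched in plain Python. This is a simplified illustration, not the project's code: it groups nearby foreground pixels with a flood fill and collapses each group into a box, discarding tiny clusters as noise. The `gap` and `min_area` parameters are hypothetical tuning knobs.

```python
def cluster_motion_pixels(pixels, gap=2):
    """Group foreground pixels (x, y) whose Chebyshev distance is <= gap.

    A minimal stand-in for connected-component labeling on a motion mask.
    """
    remaining = set(pixels)
    clusters = []
    while remaining:
        seed = remaining.pop()
        stack, cluster = [seed], [seed]
        while stack:  # flood fill outward from the seed pixel
            x, y = stack.pop()
            for dx in range(-gap, gap + 1):
                for dy in range(-gap, gap + 1):
                    n = (x + dx, y + dy)
                    if n in remaining:
                        remaining.remove(n)
                        stack.append(n)
                        cluster.append(n)
        clusters.append(cluster)
    return clusters


def motion_boxes(clusters, min_area=4):
    """Collapse each pixel cluster into (x_min, y_min, x_max, y_max),
    dropping boxes smaller than min_area as noise."""
    out = []
    for cluster in clusters:
        xs = [p[0] for p in cluster]
        ys = [p[1] for p in cluster]
        box = (min(xs), min(ys), max(xs), max(ys))
        if (box[2] - box[0] + 1) * (box[3] - box[1] + 1) >= min_area:
            out.append(box)
    return out
```

In the real pipeline the foreground mask would come from a background subtractor (e.g. OpenCV's MOG2) on NVDEC-decoded frames, and each surviving box would be cropped and handed to the YOLO detector.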
Receipt Parsing with OCR: I worked as Team Leader and Code Integrator on a receipt-parsing project with a team of 6, using Machine Learning, Python, Keras, TensorFlow, OpenCV, and YOLO object detection. We built a system that captured a receipt image on a mobile device and submitted it through a Python Flask web API. The system ran object detection on the receipt to locate fields such as the date, tax, total, and line items, using our own dataset of 10,000 receipts, which we annotated and used to train a YOLOv4 model. Along the way we gained expertise in text classification and transfer-learning techniques, as well as an understanding of RPA principles and agile development methodologies. I received accolades for my coordination within the team, was actively involved in the production release, and resolved a significant number of production deployment issues in record time.
Recommendation Engine for E-Commerce: While building a recommendation engine for an e-commerce platform, I served as core development team leader and code integrator alongside a team of six. I applied my expertise in deep learning, machine learning, Python, C++, Keras, TensorFlow, and the Apriori algorithm; specifically, we implemented Apriori in Python using TensorFlow. I also gained valuable experience with RPA, agile methodologies, and AWS, received recognition for coordination within the team, and was actively involved in the production release, where I quickly resolved a significant number of deployment issues.
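The core of the Apriori algorithm used in the recommendation engine can be sketched in pure Python (the project version ran on TensorFlow; this minimal stand-alone sketch, with an illustrative grocery example, just shows the level-wise candidate generation and support counting that define the algorithm):

```python
from itertools import combinations


def apriori(transactions, min_support=0.5):
    """Return frequent itemsets (as frozensets) mapped to their support.

    Level-wise Apriori: count 1-itemsets, keep those meeting min_support,
    join survivors into (k+1)-item candidates, repeat until nothing survives.
    """
    n = len(transactions)
    items = {item for t in transactions for item in t}
    frequent = {}
    candidates = [frozenset([item]) for item in items]
    k = 1
    while candidates:
        # Support = fraction of transactions containing the candidate itemset.
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        level = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update(level)
        # Join step: unions of frequent k-itemsets that form (k+1)-itemsets.
        keys = list(level)
        candidates = list({a | b for a, b in combinations(keys, 2)
                           if len(a | b) == k + 1})
        k += 1
    return frequent
```

Frequent itemsets found this way (e.g. "bread and milk are often bought together") feed directly into "customers also bought" style recommendations.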
Software Engineer
SurfaceInsight•  November 2018 - December 2019
As a full-stack developer, I worked across Core Java, Python, Node.js, and cross-platform technologies on the applications this project comprised. I achieved a good level of proficiency in Java, Python REST APIs, Node.js, JavaScript, TypeScript, SQL, and JSON, and gained solid knowledge of Electron desktop cross-platform applications while implementing an RPA product for this project. I came to understand RPA in practice and learned agile techniques. I received accolades for coordination within the team, was involved in production releases, and fixed a large number of production deployment issues in record time.
Software Engineer
Tapceipt•  October 2017 - November 2018
As an application developer, I worked with AngularJS and cross-platform technologies on the applications this project comprised. I achieved a good level of proficiency in Java, REST APIs, AngularJS, JavaScript, TypeScript, SQL, and JSON, and gained solid knowledge of Ionic mobile cross-platform application development while working on this project. I received accolades for coordination within the team, was involved in production releases, and fixed a large number of production deployment issues in record time.
Education
Bharathidasan University
Computer Science, BS•  June 2014 - April 2017