The Object Store for AI Data Infrastructure
Updated May 20, 2024 - Go
DominicanWho.Codes App
Developed a robust ETL pipeline for Next Cola Pvt. Ltd data that extracts data from many different OLTP sources, converts it into dimensions and facts, and loads it into a data warehouse for analytical workloads.
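A dimensions-and-facts transform like the one this pipeline describes can be sketched in plain Python. This is a minimal illustration, not code from the project; the table and column names (customer, product, amount) are hypothetical:

```python
# Minimal sketch of an OLTP-to-star-schema transform.
# Column names (customer, product, amount) are hypothetical examples.

def to_dimensions_and_facts(orders):
    """Split raw OLTP order rows into dimension tables and a fact table."""
    customers = {}  # natural key -> surrogate key
    products = {}
    facts = []
    for row in orders:
        cust_key = customers.setdefault(row["customer"], len(customers) + 1)
        prod_key = products.setdefault(row["product"], len(products) + 1)
        facts.append({"customer_key": cust_key,
                      "product_key": prod_key,
                      "amount": row["amount"]})
    dim_customer = [{"customer_key": v, "name": k} for k, v in customers.items()]
    dim_product = [{"product_key": v, "name": k} for k, v in products.items()]
    return dim_customer, dim_product, facts

orders = [
    {"customer": "Acme", "product": "Cola", "amount": 120},
    {"customer": "Acme", "product": "Diet Cola", "amount": 80},
    {"customer": "Globex", "product": "Cola", "amount": 50},
]
dim_customer, dim_product, facts = to_dimensions_and_facts(orders)
```

In a real warehouse load, the dimension and fact rows would be bulk-inserted into their respective tables rather than kept in memory.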
🌃 [KKIRI]: a site for privately sharing and organizing memories.
Self-hosted URL-to-PNG utility featuring parallel screenshot rendering with Playwright and storage caching via local disk, S3, or CouchDB.
Curated list of AWS Amplify Resources
👨🏻💻 My personal website powered by Astro, Tailwind CSS and S3 🚀💨🪣
This real-time chat backend application, built with the Go Gin framework, uses PostgreSQL as the database, JWT for authentication, Redis for session management, and an S3 bucket for storing profile pictures. Socket.IO is used for real-time communication. The entire application is fully containerized using Docker.
🔗 A multipurpose Kafka Connect connector that makes it easy to parse, transform and stream any file, in any format, into Apache Kafka
Imageboard-style discussion forum.
Backend part of Call-of-Project
Implementation of an MLOps pipeline for a US visa approval prediction application, deployed on an AWS EC2 instance using Docker, with CI/CD via GitHub Actions.
Deploy Generative AI models from Amazon SageMaker JumpStart using AWS CDK
Cyclistic Bike Ride Data - Customer Segmentation Project
An end-to-end data engineering project using Airflow and Python: extract data with the Twitter API, transform it in Python, deploy the code on Airflow/EC2, and save the final result to Amazon S3.
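The transform step of such a pipeline can be sketched as follows. This is an illustrative sketch, not the project's code: the tweet field names are hypothetical, and the final S3 upload (typically a boto3 `put_object` call inside an Airflow task) is only noted in a comment:

```python
import csv
import io

# Sketch of the transform step: flatten raw Twitter API JSON into rows,
# then serialize to CSV. The resulting text would be uploaded to S3
# (e.g. with boto3) by the pipeline; that step is omitted here.
# Field names below are hypothetical examples.

def transform(tweets):
    """Flatten nested tweet dicts into flat CSV-ready rows."""
    rows = []
    for t in tweets:
        rows.append({
            "id": t["id"],
            "user": t["user"]["screen_name"],
            "text": t["text"].replace("\n", " "),
            "retweets": t.get("retweet_count", 0),
        })
    return rows

def to_csv(rows):
    """Serialize rows to CSV text with a header line."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "user", "text", "retweets"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

tweets = [{"id": 1, "user": {"screen_name": "alice"}, "text": "hello\nworld"}]
csv_text = to_csv(transform(tweets))
```

In Airflow, extract, transform, and load would each be a task in a DAG, with the CSV handed to the load task for the S3 upload.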