Anas Tarek

Cloud-Focused Developer

About

Anas Tarek

Cloud Technology Enthusiast and Future Innovator

I am a cloud enthusiast certified by AWS and Huawei. My academic journey has given me a strong foundation and hands-on experience in cloud infrastructure, automation, and deployment through a variety of projects. I am eager to apply my technical skills in a real-world environment and am seeking an entry-level cloud-focused role where I can contribute, learn, and grow.

Tools

Here are some of the skills and tools that I am familiar with and continue to improve on.

AWS

Amazon Web Services is a cloud computing platform by Amazon, offering services like EC2, S3, and Lambda.

Huawei Cloud

Huawei Cloud provides a comprehensive cloud computing platform with services for infrastructure and AI.

Flutter

Flutter is an open-source UI framework by Google for building natively compiled applications.

Python

Python is a versatile programming language used for automation, data analysis, and cloud scripting.

Git/GitHub

Git is a distributed version control system; GitHub hosts Git repositories and enables collaboration on them.

Notion

Notion is a productivity tool for task management, note-taking, and project planning.

Terraform

Terraform is an open-source infrastructure-as-code (IaC) tool for provisioning and managing cloud resources.

Resume

Work Experience

Artificial Intelligence Intern

Aug 2024 - Sep 2024 · 2 mos

Telecom Egypt · Internship · Alexandria, Egypt · On-site

Education

Bachelor's Degree in Software Industry & Multimedia

Present

Faculty of Science, Alexandria University

I'm a student in the Software Industry and Multimedia department, Faculty of Science, Alexandria University | CGPA: 3.5/4

Certificates

Through university coursework and self-study, I have earned the following credentials. Feel free to verify my certifications or view my badges on Credly.

AWS Certified Cloud Practitioner

AWS Cloud Quest Cloud Practitioner Badge

AWS Educate Web Builder Badge

HCCDA-Tech Essentials

Huawei Cloud Certified Developer Associate

Certificate ID: HWENDCTEDA008864

Projects

Explore some of the hands-on projects I've worked on to showcase my skills in cloud computing, AI/ML, and software development.

Cloud Resume Challenge

June 2025 - Present

Technologies: AWS (S3, CloudFront, Lambda, API Gateway, DynamoDB, Route 53), Terraform, GitHub Actions, HTML, CSS, JavaScript

  • Developed and deployed a static resume website built with HTML, CSS, and JavaScript.
  • Hosted the static site on AWS S3 and served it globally with low latency using AWS CloudFront as a CDN.
  • Configured Amazon Route 53 to manage a custom domain, directing DNS queries to the CloudFront distribution.
  • Built a serverless backend using AWS Lambda and API Gateway to handle a visitor counter feature.
  • Utilized Amazon DynamoDB to store and retrieve the website's visitor count.
  • Automated the entire infrastructure provisioning process using Terraform for Infrastructure as Code (IaC).
  • Implemented a CI/CD pipeline using GitHub Actions to automatically deploy changes to the frontend and backend upon code commits.
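The visitor-counter backend above can be sketched as a minimal Lambda handler that atomically increments a DynamoDB item. The table name, item key, and attribute names here are placeholders for illustration, not the project's actual values; the DynamoDB table is injected so the logic is easy to exercise locally.

```python
import json

def increment_visits(table, key="resume"):
    """Atomically increment and return the visitor count.

    `table` is a boto3 DynamoDB Table resource; the ADD update expression
    makes the increment atomic, so concurrent visits are never lost.
    """
    resp = table.update_item(
        Key={"id": key},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(resp["Attributes"]["visits"])

def lambda_handler(event, context, table=None):
    # In the deployed function, `table` would come from
    # boto3.resource("dynamodb").Table("<table-name>") at module load.
    count = increment_visits(table)
    return {
        "statusCode": 200,
        # In practice the CORS header would name the CloudFront domain.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"visits": count}),
    }
```

Using `ADD` in the update expression (rather than a read-modify-write) keeps the counter correct even when API Gateway invokes several Lambdas concurrently.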

Soul Support

2024 - 2025

Technologies: Flutter, Dart, Python, TensorFlow, Keras, Scikit-learn, NumPy, JSON, Natural Language Processing (NLP)

  • Front-End Development: Architected and built the patient registration system from the ground up in Flutter, creating a seamless, user-friendly onboarding experience.
  • Chatbot Integration: Designed and developed "Soul Mate," a conversational AI to serve as a personal therapeutic assistant on the Soul Support platform.
  • AI & Machine Learning: Sourced and structured a comprehensive dataset of intents, patterns, and responses in a JSON file to define the chatbot's knowledge base on topics ranging from emotional states (sadness, anxiety) to factual mental health information.
  • Built, trained, and deployed a deep learning model for Natural Language Understanding using Python with TensorFlow and Keras libraries.
  • Engineered a data preprocessing pipeline to vectorize text data using the Keras Tokenizer and encode labels with Scikit-learn.
  • Implemented and trained a sequential neural network with an Embedding layer, achieving high accuracy in classifying user intent after 550 epochs of training.
  • Developed a Python-based interactive interface that uses the saved model to predict user intent and generate contextually appropriate, empathetic responses.
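The data-shaping steps above can be sketched in plain Python. This is a stdlib-only illustration of the pipeline's shape; the actual project uses the Keras Tokenizer and Scikit-learn's LabelEncoder, and the intents and patterns below are invented examples, not entries from the real dataset.

```python
# Illustrative intents, mimicking the structure of the project's JSON file.
intents = {
    "greeting": ["hello there", "hi how are you"],
    "sadness": ["i feel down", "i am so sad today"],
}

def build_vocab(intents):
    """Map each word to an integer index, reserving 0 for padding
    (what the Keras Tokenizer does under the hood)."""
    vocab = {}
    for patterns in intents.values():
        for sentence in patterns:
            for word in sentence.split():
                vocab.setdefault(word, len(vocab) + 1)
    return vocab

def vectorize(sentence, vocab, max_len=6):
    """Turn a sentence into a fixed-length sequence of word indices,
    zero-padded -- the input shape an Embedding layer expects."""
    seq = [vocab.get(w, 0) for w in sentence.split()][:max_len]
    return seq + [0] * (max_len - len(seq))

# Encode intent names as class indices (the LabelEncoder step).
labels = {name: i for i, name in enumerate(sorted(intents))}
```

The trained model then maps each padded sequence to one of these class indices, and the response generator picks a reply from the matched intent's response list.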

AWS Multi-Tier VPC Architecture

August 2024

Technologies: AWS VPC, Subnets, Internet Gateway, NAT Gateway, Route Tables

  • VPC Creation: Created a custom VPC with a specified IPv4 CIDR block (e.g., 10.0.0.0/16).
  • Subnet Creation: Designed and implemented both public and private subnets across different availability zones to ensure high availability and fault tolerance.
  • Internet Gateway: Attached an Internet Gateway to the VPC to enable internet access for the public subnet.
  • NAT Gateway: Deployed a NAT Gateway in the public subnet to allow instances in the private subnet to securely access the internet.
  • Route Tables: Configured route tables for public and private subnets to control the flow of traffic.
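The subnet layout above can be sketched with Python's `ipaddress` module: carve the VPC's /16 into /24 subnets and pair public/private subnets across two availability zones. The specific CIDRs, AZ names, and route targets below are illustrative, not the deployed values.

```python
import ipaddress

# The VPC's IPv4 CIDR block from the project description.
vpc = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc.subnets(new_prefix=24))  # 256 candidate /24 subnets

# Public subnets route 0.0.0.0/0 to the Internet Gateway; private
# subnets route outbound traffic through the NAT Gateway instead.
plan = {
    "public-a":  {"cidr": str(subnets[0]), "az": "a", "default_route": "internet-gateway"},
    "public-b":  {"cidr": str(subnets[1]), "az": "b", "default_route": "internet-gateway"},
    "private-a": {"cidr": str(subnets[2]), "az": "a", "default_route": "nat-gateway"},
    "private-b": {"cidr": str(subnets[3]), "az": "b", "default_route": "nat-gateway"},
}
```

Spreading each tier across two AZs is what gives the architecture its fault tolerance: losing one zone leaves a full public/private pair running in the other.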

Semantic Segmentation with Amazon SageMaker

2024

Technologies: Amazon SageMaker, S3, Oxford-IIIT Pet Dataset, Machine Learning

  • Data Preparation: Downloaded and organized the Oxford-IIIT Pet Dataset for use with SageMaker's semantic segmentation algorithm. Prepared the data with the correct folder structure, splitting it into training and validation sets.
  • Notebook Instance: Created and configured a SageMaker notebook instance to handle model training and deployment.
  • Data Upload to S3: Set up an S3 bucket and uploaded the dataset for access by the SageMaker environment.
  • Estimator and Hyperparameters: Created a SageMaker estimator, specifying training instances and setting up hyperparameters for semantic segmentation.
  • Model Training and Deployment: Trained a semantic segmentation model using SageMaker's built-in algorithms. Deployed the model to an endpoint for real-time inference.
  • Inference and Endpoint Management: Conducted inference with the deployed model and cleaned up resources by deleting the endpoint post-inference.
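The data-preparation step can be sketched as a deterministic train/validation split mapped onto the four S3 channels SageMaker's built-in semantic segmentation algorithm expects (train, validation, train_annotation, validation_annotation). The filenames and split fraction below are illustrative, not the exact values used in the project.

```python
import random

def split_dataset(image_names, val_fraction=0.2, seed=42):
    """Shuffle deterministically, then lay each image and its mask out
    under the channel prefixes SageMaker's algorithm expects:
    JPEG images under train/ and validation/, PNG masks under
    train_annotation/ and validation_annotation/."""
    names = sorted(image_names)
    random.Random(seed).shuffle(names)
    n_val = int(len(names) * val_fraction)
    val, train = names[:n_val], names[n_val:]
    layout = {}
    for split, items in (("train", train), ("validation", val)):
        layout[split] = [f"{split}/{name}.jpg" for name in items]
        layout[f"{split}_annotation"] = [
            f"{split}_annotation/{name}.png" for name in items
        ]
    return layout
```

Each prefix then becomes one input channel of the SageMaker estimator, with the whole layout uploaded to the S3 bucket the notebook instance reads from.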