I will do data cleaning, EDA, and machine learning
Data Cleaning • EDA • Machine Learning
About this Gig
Are you looking for a data expert to clean, analyze, and build machine learning models? You're in the right place!
I am an 8th-semester data science student with experience in EDA (Exploratory Data Analysis), data cleaning, and predictive modeling. I have worked on various projects related to data analytics and machine learning, which you can check on my GitHub: https://github.com/AngeloDusu
What I Offer:
- Data Cleaning & Preprocessing (handling missing values, duplicates, outliers, feature engineering, etc.)
- Exploratory Data Analysis (EDA) (visualization, statistical insights, trend identification)
- Machine Learning Model Development (Regression, Classification, Clustering)
- Hyperparameter Tuning & Model Optimization
- Clear Documentation & Code Explanation
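As a small taste of the cleaning and preprocessing step, here is a minimal sketch using Pandas (the dataset and column names are purely illustrative):

```python
import pandas as pd
import numpy as np

# Toy dataset with the usual problems: missing values, duplicates, an outlier
df = pd.DataFrame({
    "age":    [25, 32, np.nan, 32, 200, 41],
    "income": [40000, 52000, 48000, 52000, 61000, np.nan],
})

df = df.drop_duplicates()  # remove exact duplicate rows
df["income"] = df["income"].fillna(df["income"].median())  # impute missing income

# Clip extreme ages to the 1st-99th percentile range, then impute what's left
low, high = df["age"].quantile([0.01, 0.99])
df["age"] = df["age"].clip(low, high)
df["age"] = df["age"].fillna(df["age"].median())

print(df)
```

Real projects need domain-appropriate choices (e.g. which imputation strategy, which outlier rule), which I document alongside the code.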
Why Choose Me?
- Strong Python skills with Pandas, NumPy, Scikit-Learn, Matplotlib, and Seaborn
- Well-structured and readable code
- Open to custom requests to fit your specific needs
- Passionate about solving data-related problems
Expertise:
Classification • Churn • Clustering • Predictive analysis
Programming language:
Python
Frameworks:
Scikit-learn
APIs:
Google Cloud Vision API • Other
Tools:
Jupyter Notebook • Excel
FAQ
What do you need to get started?
I need your dataset in CSV, Excel, or JSON format. Please also provide a brief description of your goals and any specific requirements you have for data cleaning, EDA, or machine learning modeling.
What tools and libraries do you use?
I use Python with libraries like Pandas, NumPy, Scikit-Learn, Matplotlib, Seaborn, and Jupyter Notebook for data processing, analysis, and machine learning model building.
What types of machine learning models can you build?
I can build various machine learning models, including Regression, Classification, and Clustering models. I also offer feature engineering and hyperparameter tuning for better model performance.
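To illustrate what a classification model with hyperparameter tuning looks like, here is a short sketch using scikit-learn's built-in iris dataset (the model and parameter grid are just examples, not a fixed recipe):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Small illustrative grid; a real project would tune more parameters
grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [50, 100], "max_depth": [3, None]},
    cv=5,
)
grid.fit(X_train, y_train)

print("Best params:", grid.best_params_)
print("Test accuracy:", grid.score(X_test, y_test))
```

The same pattern (pipeline, cross-validated search, held-out evaluation) applies to regression and clustering tasks as well.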
Do you provide cloud deployment or API integration?
Currently, I do not offer cloud deployment or API integration. However, I will provide a well-documented Jupyter Notebook so you can easily run and test the model on your own system.
What if my dataset is too large?
For large datasets (more than 500K rows), please contact me first so we can discuss the best approach. I might need to optimize the processing steps or suggest an alternative solution.
What file formats will I receive?
You will receive:
- A cleaned dataset (CSV, Excel, or JSON)
- A Jupyter Notebook (.ipynb) file with all the code and explanations
- A PDF or Excel report (if included in your package)
Can I request a custom service?
Yes! If you have specific requirements that are not listed in my gig, feel free to send me a message. I am happy to discuss a custom solution tailored to your needs.
What if I need revisions?
I offer revisions depending on the package you choose. Please check the details of each package, and let me know if you need any minor adjustments after delivery.
How long will the project take?
Delivery time depends on the package you choose, but most projects take 3-4 days. For urgent requests, feel free to contact me before ordering!
