Machine Learning Basics with Python
Begin Your AI Journey with DARCHUMSTECH
Introduction
Machine Learning teaches computers to learn from data without being explicitly programmed. Python is the king of ML because it's simple, powerful, and has a huge ecosystem of ready-to-use libraries.
Requirements
- Python 3.x installed
- Libraries: numpy, pandas, matplotlib, scikit-learn
- Basic Python skills
pip install numpy pandas matplotlib scikit-learn
Step 1: Load the Dataset

import numpy as np
import pandas as pd
from sklearn.datasets import load_iris

# Load the built-in Iris dataset: 150 flowers, 4 measurements each
iris = load_iris()
X = iris.data    # features (sepal/petal length and width)
y = iris.target  # labels (0, 1, 2 for the three species)
Step 2: Explore the Data

df = pd.DataFrame(X, columns=iris.feature_names)
df['species'] = iris.target_names[y]
print(df.head())
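For a quick extra look at the data (an optional sketch beyond the original steps), pandas can summarize each feature and count the samples per species:

print(df.describe())                  # summary statistics for the four features
print(df['species'].value_counts())  # the Iris dataset has 50 samples per species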
Step 3: Visualize with Matplotlib

import matplotlib.pyplot as plt

plt.scatter(df.iloc[:, 0], df.iloc[:, 1], c=y, cmap='plasma')
plt.xlabel(iris.feature_names[0])
plt.ylabel(iris.feature_names[1])
plt.title("Iris Flower Dataset")
plt.show()
✂️ Step 4: Split & Scale Data
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 70% of the data for training, 30% held out for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Fit the scaler on the training data only, then apply the same
# transformation to the test data to avoid leaking test information
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
Step 5: Train the Model

from sklearn.neighbors import KNeighborsClassifier

model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)
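Once the model is fitted, you can also classify a brand-new flower. The measurements below are made up for illustration; note that new samples must be scaled with the same scaler fitted on the training data:

new_flower = [[5.1, 3.5, 1.4, 0.2]]  # hypothetical sepal/petal measurements in cm
new_flower_scaled = scaler.transform(new_flower)
prediction = model.predict(new_flower_scaled)
print("Predicted species:", iris.target_names[prediction][0])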
Step 6: Evaluate the Model

from sklearn.metrics import accuracy_score, confusion_matrix, classification_report

y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Confusion Matrix:\n", confusion_matrix(y_test, y_pred))
print("Report:\n", classification_report(y_test, y_pred))
Note: Try adjusting n_neighbors to improve your model's performance!
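For example, one simple way to choose n_neighbors is to loop over a few values and compare test accuracy (a rough sketch that reuses the variables above; cross-validation would give a more reliable picture):

for k in range(1, 11):
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    acc = accuracy_score(y_test, knn.predict(X_test))
    print(f"k={k}: test accuracy = {acc:.3f}")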
Key ML Concepts
- Supervised Learning: Learn with labeled data
- Unsupervised Learning: Discover hidden patterns
- Overfitting: the model fits the training data too closely and performs poorly on new data
- Underfitting: the model is too simple and performs poorly on both training and new data (see the quick check below)
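A quick heuristic check with the model trained above: compare accuracy on the training set with accuracy on the test set. A large gap hints at overfitting, while low scores on both hint at underfitting (this snippet is a suggestion, not part of the original steps):

train_acc = accuracy_score(y_train, model.predict(X_train))
test_acc = accuracy_score(y_test, model.predict(X_test))
print(f"Train accuracy: {train_acc:.3f}")
print(f"Test accuracy:  {test_acc:.3f}")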
Challenge:
Replace k-NN with a Decision Tree and compare accuracy!
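If you want a starting point, here is a minimal sketch using scikit-learn's DecisionTreeClassifier on the same split (default hyperparameters, assumed only as one reasonable baseline):

from sklearn.tree import DecisionTreeClassifier

tree = DecisionTreeClassifier(random_state=42)
tree.fit(X_train, y_train)
tree_acc = accuracy_score(y_test, tree.predict(X_test))
print("Decision Tree accuracy:", tree_acc)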