Graphic design with Python turtle 🐢

from turtle import *
import colorsys

# Spiral pattern: the hue h is advanced each iteration to cycle through colours
bgcolor('black')
pensize(0)
tracer(50)
h = 0
for i in range(300):
    c = colorsys.hsv_to_rgb(h, 1, 1)
    h += 0.9
    color(c)
    forward(300)
    left(100)
    fd(i)
    goto(0, 0)
    down()
    rt(90)
    begin_fill()
    circle(0)
    end_fill()
    rt(10)
    for j in range(5):
        rt(30)
done()

Please follow my blog and subscribe to my channel for more videos and new updates 👍👍👍👍👍

import turtle as t
import colorsys

t.bgcolor('black')
t.tracer(100)
h = 0.4

# Draw two arcs whose radius grows with n, turning by ang between them
def draw(ang, n):
    t.circle(5 + n, 60)
    t.left(ang)
    t.circle(5 + n, 60)

for i in range(200):
    c = colorsys.hsv_to_rgb(h, 1, 1)
    h += 0.005
    t.color(c)
    t.pensize(2)
    draw(90, i * 2)
    draw(120, i * 2.5)
t.done()  # keep the window open

Regression in Machine Learning

 

Regression

Regression models are used to predict a continuous value. Predicting the price of a house given features of the house, such as its size, is one of the common examples of regression. It is a supervised technique.

Types of Regression:

  • Simple Linear Regression
  • Polynomial Regression
  • Support Vector Regression
  • Decision Tree Regression
  • Random Forest Regression



1. Simple Linear Regression: This is one of the most common and interesting types of regression techniques. Here we predict a target variable Y based on the input variable X. A linear relationship should exist between the target variable and the predictor, and hence the name linear regression.
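As a minimal sketch (assuming scikit-learn, which this post does not mention, and a small made-up dataset), fitting a simple linear regression of house price on house size looks like this:

import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up data for illustration: house size in square feet (X) and price (y)
X = np.array([[650], [800], [1000], [1200], [1500]])
y = np.array([70, 95, 120, 150, 185])

model = LinearRegression()
model.fit(X, y)                          # learns the slope and intercept of the line
print(model.coef_[0], model.intercept_)  # fitted line: y = coef * x + intercept
print(model.predict([[1100]]))           # predicted price for a 1100 sq ft house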

2. Polynomial Regression: In polynomial regression, we transform the original features into polynomial features of a given degree and then apply linear regression to them.
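A short sketch of the same idea, again assuming scikit-learn and made-up data: the input is expanded into polynomial features of degree 2, and ordinary linear regression is fitted on the expanded features.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Made-up, roughly quadratic data for illustration
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([1.2, 4.1, 9.3, 15.8, 25.1, 36.2])

# Step 1: transform x into [1, x, x^2]; Step 2: fit linear regression on those features
poly_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
poly_model.fit(X, y)
print(poly_model.predict([[7]]))  # prediction for a new input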

3. Support Vector Regression: In SVR, we identify a hyperplane with maximum margin such that the maximum number of data points lie within that margin. SVR is similar to the SVM classification algorithm.
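A minimal SVR sketch (again assuming scikit-learn and made-up data); the epsilon parameter sets the width of the margin around the fitted function:

import numpy as np
from sklearn.svm import SVR

# Made-up data for illustration
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
y = np.array([1.5, 2.1, 2.9, 4.2, 5.1, 5.8, 7.2, 8.1])

# Points falling inside the epsilon margin do not contribute to the loss
svr = SVR(kernel='rbf', epsilon=0.5)
svr.fit(X, y)
print(svr.predict([[4.5]]))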

4. Decision Tree Regression: Decision trees can be used for classification as well as regression. In decision trees, at each level we need to identify the splitting attribute.
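A minimal sketch with scikit-learn's DecisionTreeRegressor and made-up data; max_depth limits how many levels of splits the tree can make:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Made-up data for illustration
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([2.0, 2.5, 6.0, 6.5, 11.0, 11.5])

tree = DecisionTreeRegressor(max_depth=2)
tree.fit(X, y)
print(tree.predict([[3.5]]))  # the prediction is the mean target value of the matching leaf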

5. Random Forest Regression: Random forest is an ensemble approach where we take into account the predictions of several decision regression trees. It works as follows (a short code sketch follows the steps):

1) Select k random data points from the training set.

2) Identify n, the number of decision tree regressors to be created. Repeat steps 1 and 2 to create several regression trees.

3) The average of each branch is assigned to the leaf node of each decision tree.

4) To predict the output for a new data point, the average of the predictions of all the decision trees is taken into consideration.
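These steps are roughly what scikit-learn's RandomForestRegressor does (it trains each tree on a random bootstrap sample and averages the trees' predictions). A minimal sketch with made-up data:

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Made-up data for illustration
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
y = np.array([2.1, 2.8, 4.0, 4.5, 6.1, 6.8, 8.2, 8.9])

# n_estimators is the number of decision tree regressors (step 2);
# the final prediction is the average over all the trees (step 4)
forest = RandomForestRegressor(n_estimators=100, random_state=0)
forest.fit(X, y)
print(forest.predict([[5.5]]))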
