Dive into Deep Learning

Linear Neural Networks / LeeSaeBom

Linear Regression
Softmax Regression
 
Linear Regression

 : Given the y values that correspond to known x values, infer the y value for a new x value that was not given.

  Ex >  x [ 1, 2, 3 ]        Q?  When x = 4, what is y?        A : "9"
        y [ 3, 5, 7 ]
 
Hypothesis : H(W, b) = Wx + b

    Initialization : W = 1, b = 0
    Goal : y = 2x + 1
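
To make the hypothesis concrete, here is a minimal sketch in Python (the slides themselves contain no code) that evaluates H(W, b) = Wx + b on the example data, first with the initial parameters W = 1, b = 0 and then with the goal parameters W = 2, b = 1.

    # Hypothesis H(W, b) = W*x + b evaluated on the slide's example data.
    xs = [1, 2, 3]
    ys = [3, 5, 7]                      # targets, consistent with the goal y = 2x + 1

    def hypothesis(W, b, x):
        return W * x + b

    # Initial parameters from the slide: W = 1, b = 0.
    print([hypothesis(1.0, 0.0, x) for x in xs])   # [1.0, 2.0, 3.0] : far from [3, 5, 7]

    # Goal parameters: W = 2, b = 1.
    print([hypothesis(2.0, 1.0, x) for x in xs])   # [3.0, 5.0, 7.0] : matches the targets
    print(hypothesis(2.0, 1.0, 4))                 # 9.0, the slide's answer for x = 4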
 
 
 
 
 
 
 
 
The function that tells how far off the hypothesis is from the goal
 : Cost Function

A large least-squares value  =  far from the goal  =  the hypothesis values W, b are badly wrong.

The least-squares value converges to '0'  =  close to the goal.

cf . Why the least-squares method is used
  1. Cost up = penalty up : mistakes are recognized quickly, so the learning speed goes up.
  2. An absolute-value loss internally requires a conditional branch, which slows down computation.
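
The cost formula itself did not survive extraction; assuming the usual least-squares (mean squared error) form, Cost(W, b) = (1/m) * sum_i (W*x_i + b - y_i)^2, a minimal sketch of computing it on the example data looks like this:

    xs = [1, 2, 3]
    ys = [3, 5, 7]

    def cost(W, b, xs, ys):
        # Mean of the squared differences between the hypothesis and the targets.
        m = len(xs)
        return sum((W * x + b - y) ** 2 for x, y in zip(xs, ys)) / m

    print(cost(1.0, 0.0, xs, ys))   # ~9.67 : the initial guess W = 1, b = 0 is far from the goal
    print(cost(2.0, 1.0, xs, ys))   # 0.0   : the goal y = 2x + 1 drives the cost to zero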
 
[Figure: Cost plotted against w and against b; each curve is marked with its global optimum.]
 
Gradient Descent Algorithm
     : An algorithm that optimizes the Cost Function and finds the global optimum.

How to get the "W" ?  By differentiation: take the partial derivative of the Cost Function with respect to W (and b).
 
 
Linear Regression in Machine Learning
   (Gradient Descent Algorithm)

  Repeat until the cost converges (the Y / N check in the flow chart):
    W := W - α · ∂Cost(W, b) / ∂W
    b := b - α · ∂Cost(W, b) / ∂b

  α (learning rate) : ex > 0.001
  Data set : processed in chunks of a fixed batch size (= mini batch)
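
A minimal sketch of the update loop above, assuming the mean-squared-error cost from the previous section; the learning rate 0.01 and the fixed iteration count are illustrative choices (the slides cite 0.001 only as an example learning rate), and the convergence check is simplified to a fixed number of steps.

    xs = [1, 2, 3]
    ys = [3, 5, 7]

    def gradients(W, b, xs, ys):
        # Partial derivatives of the mean-squared-error cost with respect to W and b.
        m = len(xs)
        dW = sum(2 * (W * x + b - y) * x for x, y in zip(xs, ys)) / m
        db = sum(2 * (W * x + b - y) for x, y in zip(xs, ys)) / m
        return dW, db

    W, b = 1.0, 0.0         # initialization from the earlier slide
    alpha = 0.01            # learning rate (illustrative)

    for _ in range(10000):  # stand-in for the "repeat until converged" Y/N check
        dW, db = gradients(W, b, xs, ys)
        W = W - alpha * dW
        b = b - alpha * db

    print(round(W, 3), round(b, 3))   # approaches the goal W = 2, b = 1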
 
Linear Regression Examples
 
 
Sigmoid Function

H(W, b) = Wx + b   ->  not suitable for binary classification

Sigmoid Function : σ(x) = 1 / (1 + e^(-x)), which squashes any real value into (0, 1).
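
A minimal sketch of the sigmoid, showing how it squashes an unbounded linear output Wx + b into (0, 1) so it can be read as a class probability; the sample inputs are illustrative.

    import math

    def sigmoid(z):
        # Maps any real number into the open interval (0, 1).
        return 1.0 / (1.0 + math.exp(-z))

    # A raw linear output Wx + b can be any real number ...
    for z in (-4.0, 0.0, 4.0):
        print(z, "->", round(sigmoid(z), 3))   # 0.018, 0.5, 0.982
    # ... while sigmoid(Wx + b) stays between 0 and 1, which suits binary classification.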
 
Softmax Regression

Softmax : softmax(z_i) = e^(z_i) / Σ_j e^(z_j), so the outputs are probabilities that sum to 1.

Logit : log(Odds(p)) maps a probability p in [0, 1] onto the whole real line; softmax maps logits back to probabilities.

        logit      ->      probability

The high value 0.879 can be classified as 1, and the low values 0.119 and 0.002 as 0.
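
A minimal sketch of softmax; the logits [6.0, 4.0, 0.0] are an assumption chosen only because they reproduce the slide's example probabilities (0.879, 0.119, 0.002), not values taken from the slides.

    import math

    def softmax(logits):
        # Subtract the maximum for numerical stability, then normalize the exponentials.
        m = max(logits)
        exps = [math.exp(z - m) for z in logits]
        total = sum(exps)
        return [e / total for e in exps]

    probs = softmax([6.0, 4.0, 0.0])       # assumed logits, not from the slides
    print([round(p, 3) for p in probs])    # [0.879, 0.119, 0.002]
    # The highest probability (0.879) is classified as 1, the lower ones as 0.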