"TensorFlow 단순회귀분석"의 두 판 사이의 차이

* TensorFlow is a 'general-purpose model-training library' rather than a 'statistical analysis package', so its approach to regression is quite different: the fit is found by iteratively minimizing a loss function instead of by a closed-form solve.


==Example 1==
<source lang='python'>
import tensorflow as tf

# Training data: y = x exactly, so the best fit is W = 1, b = 0
x_data = [1, 2, 3]
y_data = [1, 2, 3]
learning_rate = 0.05

# Model parameters: W starts at a random value in [-1, 1), b at 0
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
y = W * x_data + b

# Mean squared error, minimized by plain gradient descent (TensorFlow 1.x API)
loss = tf.reduce_mean(tf.square(y - y_data))
train = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for step in range(2001):
    sess.run(train)
    if step % 200 == 0:
        print(step, sess.run(W), sess.run(b))
# 0 [ 0.02381748] [ 0.36606845]
# 200 [ 0.97388893] [ 0.05935668]
# 400 [ 0.99767464] [ 0.00528606]
# 600 [ 0.99979293] [ 0.00047073]
# 800 [ 0.99998158] [  4.19568278e-05]
# 1000 [ 0.99999833] [  3.76215985e-06]
# 1200 [ 0.99999964] [  7.06425851e-07]
# 1400 [ 0.99999964] [  6.58742181e-07]
# 1600 [ 0.99999964] [  6.58742181e-07]
# 1800 [ 0.99999964] [  6.58742181e-07]
# 2000 [ 0.99999964] [  6.58742181e-07]
</source>
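Each <code>sess.run(train)</code> call above performs one full-batch gradient-descent update. As a sanity check, the same update rule can be written in dependency-free Python with hand-derived gradients (a sketch of what the optimizer obtains via automatic differentiation, not TensorFlow's actual implementation); it converges to the same W ≈ 1, b ≈ 0:

<source lang='python'>
x_data = [1, 2, 3]
y_data = [1, 2, 3]
learning_rate = 0.05
W, b = 0.0, 0.0
n = len(x_data)

for step in range(2001):
    # residuals e_i = (W*x_i + b) - y_i
    errors = [W * xi + b - yi for xi, yi in zip(x_data, y_data)]
    # gradients of loss = mean((W*x + b - y)^2):
    #   dL/dW = (2/n) * sum(e_i * x_i),  dL/db = (2/n) * sum(e_i)
    grad_W = (2.0 / n) * sum(e * xi for e, xi in zip(errors, x_data))
    grad_b = (2.0 / n) * sum(errors)
    W -= learning_rate * grad_W
    b -= learning_rate * grad_b

print(W, b)  # approaches the exact fit W = 1, b = 0
</source>

The only differences from the TensorFlow run are the fixed initial W (0.0 instead of a random draw) and float64 arithmetic instead of float32.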
Revision as of 10:09, 21 December 2017 (Thu)

==Example 2==

<source lang='python'>
import tensorflow as tf

# 15 (x, y) pairs; the fitted line converges to roughly y = 61.26x - 39.04 (see output below)
x_data = [1.47, 1.50, 1.52, 1.55, 1.57, 1.60, 1.63, 1.65, 1.68, 1.70, 1.73, 1.75, 1.78, 1.80, 1.83]
y_data = [52.21, 53.12, 54.48, 55.84, 57.20, 58.57, 59.93, 61.29, 63.11, 64.47, 66.28, 68.10, 69.92, 72.19, 74.46]
learning_rate = 0.02

W = tf.Variable(0.0)
b = tf.Variable(0.0)
y = W * x_data + b
cost = tf.reduce_mean(tf.square(y - y_data))
train = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

# Unlike Example 1, tens of thousands of steps are needed here: the x values
# all cluster around 1.6, which makes the problem ill-conditioned for plain
# gradient descent at this learning rate
for step in range(80001):
    sess.run(train)
    if step % 10000 == 0:
        print("step=", step, "W=", sess.run(W), "b=", sess.run(b))

# step= 0 W= 4.12865 b= 2.48312
# step= 10000 W= 52.1455 b= -23.9476
# step= 20000 W= 58.7961 b= -34.9614
# step= 30000 W= 60.6001 b= -37.9489
# step= 40000 W= 61.0897 b= -38.7597
# step= 50000 W= 61.2225 b= -38.9796
# step= 60000 W= 61.2588 b= -39.0398
# step= 70000 W= 61.2619 b= -39.0449
# step= 80000 W= 61.2619 b= -39.0449
</source>
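The values the loop settles on can be checked against the closed-form least-squares solution, W = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)² and b = ȳ − W·x̄, sketched here in plain Python (no TensorFlow needed). It gives W ≈ 61.27, b ≈ −39.06, which the gradient-descent run above approaches:

<source lang='python'>
x_data = [1.47, 1.50, 1.52, 1.55, 1.57, 1.60, 1.63, 1.65, 1.68, 1.70, 1.73, 1.75, 1.78, 1.80, 1.83]
y_data = [52.21, 53.12, 54.48, 55.84, 57.20, 58.57, 59.93, 61.29, 63.11, 64.47, 66.28, 68.10, 69.92, 72.19, 74.46]

n = len(x_data)
x_mean = sum(x_data) / n
y_mean = sum(y_data) / n

# Ordinary least squares: W = Sxy / Sxx, b = y_mean - W * x_mean
sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(x_data, y_data))
sxx = sum((x - x_mean) ** 2 for x in x_data)
W = sxy / sxx
b = y_mean - W * x_mean

print(W, b)  # roughly 61.27 and -39.06
</source>

The small gap between this exact answer and the printed step-80000 values (61.2619, -39.0449) suggests the float32 gradient-descent run has stalled just short of full convergence.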

==See also==
