TensorFlow Linear Regression


Overview

GradientDescentOptimizer
import tensorflow as tf

# Training data: y = x, so the ideal fit is W = 1, b = 0
x_train = [1, 2, 3]
y_train = [1, 2, 3]

# Model parameters, initialized with random values
W = tf.Variable(tf.random_normal([1]), name='weight')
b = tf.Variable(tf.random_normal([1]), name='bias')

# Linear hypothesis: H(x) = W * x + b
hypothesis = x_train * W + b

# Cost: mean squared error between predictions and targets
cost = tf.reduce_mean(tf.square(hypothesis - y_train))
# Minimize the cost with gradient descent (learning rate 0.01)
train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(cost)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

# Run 2001 training steps, printing progress every 200 steps
for step in range(2001):
    sess.run(train)
    if step % 200 == 0:
        print(step, sess.run(cost), sess.run(W), sess.run(b))
Output
0 10.3343 [-0.58280057] [ 0.22211805]
200 0.0324351 [ 0.79082805] [ 0.47549695]
400 0.0123853 [ 0.87074447] [ 0.29382828]
600 0.00472931 [ 0.92012799] [ 0.18156797]
800 0.00180589 [ 0.9506439] [ 0.11219799]
1000 0.000689574 [ 0.96950084] [ 0.06933162]
1200 0.000263314 [ 0.98115343] [ 0.04284282]
1400 0.000100546 [ 0.98835391] [ 0.02647423]
1600 3.83936e-05 [ 0.99280345] [ 0.01635946]
1800 1.4661e-05 [ 0.99555296] [ 0.01010923]
2000 5.59837e-06 [ 0.99725193] [ 0.00624691]
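As the output shows, W converges toward 1 and b toward 0, matching y = x. As a minimal follow-up sketch (not part of the original example), the trained parameters can be read back from the still-open session and used to predict values for new inputs, which should land very close to the inputs themselves:

# Hypothetical follow-up, not in the original article: reuse the open session
# to read the trained parameters and predict values for unseen inputs.
w_val, b_val = sess.run([W, b])          # roughly [0.997] and [0.006] after 2000 steps
for x_new in [4.0, 5.0]:
    print(x_new, '->', w_val[0] * x_new + b_val[0])   # expected to be close to x_new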