A quick look at AlphaGo and today's AI

394 replies
615 Like 13 Dislike
2017-06-10 09:56:05
Any bros who can explain back propagation, k-nearest neighbor, support vector machine and decision tree? I'm an EE guy with no ML background, but in the next few days I need to know what they do


Backpropagation is an algorithm for computing the parameters inside a neural network. The idea is to take the error in the network's output and propagate it backwards, layer by layer, to compute the parameters that reach the optimal error. The end result is a trained neural network, I think for classification
k-nearest neighbor (K-means) does data classification. The idea is that, among the k data points nearest to each point, the majority should belong to the same class. The end result is the data sorted into classes
Support vector machine (SVM) does data classification. The idea is to measure distances using where the data projects onto the boundary. The end result is a boundary that cuts right down the middle between the two groups of data
Decision tree, I assume you mean the classification-learning kind. The idea is to build a condition on one feature at a time, giving you a classifier that follows the conditions. Example: Are you at least 180cm tall? Yes > Did you get 30 points in the HKCEE? Yes > Is your dick 30cm long? Yes > ...and so on... > you're a Golden forum guy
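That chain-of-conditions idea can be sketched directly as nested if-tests. A minimal toy sketch, with the features and group labels made up for illustration (a real decision-tree learner would choose the split conditions from data):

```python
def classify(height_cm, exam_points):
    """Toy decision tree: each node tests one feature and branches."""
    if height_cm >= 180:
        if exam_points >= 30:
            return "group A"
        return "group B"
    return "group C"

print(classify(185, 31))  # group A
print(classify(170, 10))  # group C
```

Decision tree learning is then just the question of which feature and which threshold to test at each node, usually chosen to split the classes as cleanly as possible.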


Disclaimer: I don't really know this stuff, I'm self-taught off the internet with no background either, but since nobody seemed to be answering you I'll say a couple of things. Anything unclear, let's work it out together. Below are some references handed out in a Coursera course I took last year

Backpropagation Algorithm
http://ufldl.stanford.edu/wiki/index.php/Backpropagation_Algorithm
An Idiot’s guide to Support vector machines (SVMs)
http://web.mit.edu/6.034/wwwbob/svm-notes-long-08.pdf

BP is about learning from mistakes: you use a self-defined error measure (the simplest is Euclidean distance) to improve the neural network. With too high a learning rate it learns everything from this one example but loses a lot of what it learned before

The aim is to minimize the error. Geometrically, you want to reach the minimum point of the error curve, so partial derivatives come in to find that point, just like finding a minimum point in DSE curve sketching

If you have some numerical analysis background, you can think of it as the step size being too large, so you never reach the optimal point
In signal terms, it's like sampling too sparsely, so you never see the true minimum (not 100% accurate, just a similar idea)
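The step-size point shows up clearly in a minimal gradient-descent sketch. The function and learning rates below are made up for illustration: with a small learning rate the update walks down to the minimum, while too large a step overshoots it further each time:

```python
def grad_descent(lr, steps=50, x=0.0):
    """Minimize f(x) = (x - 3)^2 by gradient descent; f'(x) = 2*(x - 3)."""
    for _ in range(steps):
        x -= lr * 2 * (x - 3)  # step downhill, scaled by the learning rate
    return x

print(grad_descent(0.1))  # converges close to the minimum at x = 3
print(grad_descent(1.1))  # step size too large: bounces away from x = 3
```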

K-means is mostly for clustering: it sticks together the points that are closest to each other (that distance again has to be defined yourself, usually Euclidean)
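The k-means loop is short enough to sketch in full. A minimal 1-D version with made-up toy data (real uses are multi-dimensional, but the two alternating steps, assign then re-centre, are the same):

```python
import random

def kmeans(points, k, iters=20):
    """Minimal 1-D k-means: assign each point to its nearest centre,
    then move each centre to the mean of its cluster, and repeat."""
    centres = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centres[i]))
            clusters[nearest].append(p)
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)

random.seed(0)
data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
print(kmeans(data, 2))  # two centres, one near 1.0 and one near 10.0
```

Note there are no labels anywhere: the algorithm only ever looks at distances, which is what makes it unsupervised.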

SVM is mostly used for classification. It depends on which kernel you use; with none it's linear, finding an optimal straight line that separates the two groups
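What "a straight line separating two groups" means in code: a linear boundary is just w·x + b = 0, points are classified by which side they fall on, and the distance to the line is what the SVM margin is measured in. The weights below are made up for illustration; a real SVM would learn w and b to maximize that margin:

```python
import math

# Hand-picked linear boundary w·x + b = 0 (illustrative, not learned)
w = (1.0, -1.0)
b = 0.0

def side(x):
    """Classify a 2-D point by which side of the line it falls on."""
    s = w[0] * x[0] + w[1] * x[1] + b
    return "group A" if s > 0 else "group B"

def distance(x):
    """Perpendicular distance from the point to the boundary: |w·x + b| / ||w||."""
    s = w[0] * x[0] + w[1] * x[1] + b
    return abs(s) / math.hypot(w[0], w[1])

print(side((2, 1)))                # group A
print(round(distance((2, 1)), 3))  # 0.707
```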


Right, I called everything classification, which wasn't clear
Classification splits into supervised and unsupervised. Supervised means the samples in hand already have known classes, and you use the existing ones to predict future ones. Unsupervised means you don't yet know which belongs where, and you estimate a grouping

Bro, can you explain decision tree learning too?

Clustering is 聚類 in Chinese and classification is 分類. The former is unsupervised, the latter is supervised. k-means is unsupervised and does clustering; k-nearest neighbors is supervised and does classification. That should be how it is
2017-06-10 10:18:40


Isn't k-nearest neighbors the same as k-means, apart from the supervised vs unsupervised part?

The two algorithms are completely different. The only thing they share is a k; supervised vs unsupervised is already a world apart

Google them, and try working through each step of the two algorithms by hand, or look at some concrete examples
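To make the contrast concrete, here is a minimal 1-D k-NN classifier on made-up toy data. Unlike k-means above, it needs labelled training points and does no iterating at all: it just finds the k nearest labelled points and takes a majority vote:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """k-NN (supervised): vote among the k labelled points nearest the query.
    train is a list of (value, label) pairs."""
    nearest = sorted(train, key=lambda vl: abs(vl[0] - query))[:k]
    vote = Counter(label for _, label in nearest)
    return vote.most_common(1)[0][0]

train = [(1.0, "low"), (1.2, "low"), (0.9, "low"),
         (9.8, "high"), (10.1, "high"), (10.0, "high")]
print(knn_predict(train, 1.1))  # low
print(knn_predict(train, 9.5))  # high
```

So the shared k means two different things: in k-NN it is how many labelled neighbours vote, in k-means it is how many clusters you ask for.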
2017-06-10 13:47:23
Help, I have to read a doctoral thesis now and learn these 4 things in 2-3 days, with no ML background, and I'm not even from CS
2017-06-10 15:07:45
Bro, you're up this early even on a Saturday. Let me take the chance to ask: is Euclidean distance the same as the L2 norm? All these names for one thing are confusing. When do you use which?


Norm is a concept from linear algebra
It really doesn't matter which name you use

So many names. Back in secondary school I could do maths all day without meeting this many English terms

Loads of names. There's also l2 distance, l2 metric, 2-norm, 2-distance, standard metric, standard norm, Euclidean norm, Euclidean metric, Euclidean length, ...
They all refer to the same thing

A norm (length of a vector) and a metric (distance between 2 points) aren't the same thing though

It's just that a norm has an associated metric
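The distinction is two lines of code: the L2 norm measures the length of one vector, and it induces a metric by taking the norm of the difference of two points, d(p, q) = ||p - q||. A minimal sketch:

```python
import math

def l2_norm(v):
    """Norm: length of a single vector."""
    return math.sqrt(sum(x * x for x in v))

def l2_metric(p, q):
    """Metric induced by the norm: distance between two points
    is the norm of their difference, d(p, q) = ||p - q||."""
    return l2_norm([a - b for a, b in zip(p, q)])

print(l2_norm([3, 4]))            # 5.0
print(l2_metric([1, 1], [4, 5]))  # 5.0
```

So "Euclidean distance" is the metric, "L2 norm" is the length function that generates it, and in practice people use the names interchangeably.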
2017-06-10 16:31:35
Help, I have to read a doctoral thesis now and learn these 4 things in 2-3 days, with no ML background, and I'm not even from CS

That stuff is applied math...
2017-06-10 16:45:57
Help, I have to read a doctoral thesis now and learn these 4 things in 2-3 days, with no ML background, and I'm not even from CS

That stuff is applied math...

Applied math or CS, whatever it is, I'm an EE guy staring at this thesis wanting to die, no idea what the hell it's doing
2017-06-13 15:56:59
Help, I have to read a doctoral thesis now and learn these 4 things in 2-3 days, with no ML background, and I'm not even from CS

Three days have passed, how's it going now?

Leaving it aside for now. Roughly knowing what it's doing is good enough
2017-06-13 20:34:49
LM
2017-06-13 20:37:44
lm
2017-06-13 20:50:51
What basic maths do I need to know?

What I most want is to understand the mathematical principles behind it
2017-06-13 20:56:14
What basic maths do I need to know?

What I most want is to understand the mathematical principles behind it

Calculus, linear algebra, probability