Is there any deep-learning material that covers non-image-processing applications?
Or is it really just a matter of treating the data as one n-dimensional array?
I know a lot of the concepts, but TensorFlow is genuinely hard to use.
Right now it feels like I've learned a pile of internal kung fu
but no actual moves.
No idea what it's all for.....
PyTorch
TensorFlow feels more powerful though
Either way,
you need to be able to handle TF first.
Don't assume it's amazing just because Google's name is on it (though it genuinely is impressive) — it's really the elephant in the room.
PyTorch took off as soon as it launched because it's concise and efficient, and development time is a cost too.
Purely on performance, which is better — PyTorch or TF?
I'd say it's the other way round: image classification itself treats an image as one n-dimensional piece of data.
In fact the model will fit anything you feed it, not just images.
What I'd rather know is what applications models like CNNs — directly inspired by the human visual system — have outside image processing.
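To illustrate the point above — that a model consumes an image as just another n-dimensional blob of numbers — here is a toy pure-Python sketch (my own illustration, not from the thread; `flatten` is a hypothetical helper name) showing an image and a tabular record both reduced to the same kind of flat feature vector:

```python
def flatten(nested):
    """Recursively flatten an arbitrarily nested list (e.g. an HxWxC
    image, or any other n-dimensional array) into one feature vector."""
    out = []
    for item in nested:
        if isinstance(item, list):
            out.extend(flatten(item))
        else:
            out.append(item)
    return out

image = [[[0, 1], [2, 3]], [[4, 5], [6, 7]]]  # a 2x2 "image" with 2 channels
tabular = [0.1, 0.2, 0.3]                     # plain non-image data

# Both become flat vectors that the same fully connected model could consume:
vec_img = flatten(image)    # 8 features
vec_tab = flatten(tabular)  # 3 features
```

The fully connected layers downstream never know (or care) whether the vector started life as pixels.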
Will OP cover LSTM and dropout rate? Time series work generally uses LSTM.
Newer networks don't use dropout any more. The idea is to randomly drop some neurons during training to prevent overfitting.
By ResNet it was replaced with residual connections and the dropout mechanism is gone — some people argue the shortcut is essentially dropout with probability 0.5.
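For reference, the dropout mechanism described above can be sketched in a few lines of pure Python (my own toy illustration — the function name `inverted_dropout` is hypothetical; real frameworks implement this on tensors). Survivors are scaled by 1/(1-p) — "inverted" dropout — so the expected activation is unchanged and the layer becomes a plain identity at inference time:

```python
import random

def inverted_dropout(x, p, rng=random):
    """Zero each activation with probability p, scaling survivors by
    1/(1-p) so the expected value is preserved. Training-time only."""
    if not 0 <= p < 1:
        raise ValueError("p must be in [0, 1)")
    keep = 1.0 - p
    return [xi / keep if rng.random() < keep else 0.0 for xi in x]

# Toy usage: drop roughly half of a layer's activations.
activations = [0.5, -1.2, 3.0, 0.7]
dropped = inverted_dropout(activations, p=0.5)
```

With p=0.5 each surviving value is doubled, which is exactly why the "shortcut ≈ 0.5 dropout" analogy in the post gets made.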
So do the FC layers skip dropout too, or is there no FC layer at all?
Dropout is probably still an open research question. It's applied at the last FC layer, and it's unclear how much it still matters at that point.
I read Hinton's latest(?) capsule net paper — it seems not to mention dropout at all; the analogous behaviour is replaced with dynamic routing. That paper matters: three weeks out and already heavily discussed, with real potential to revolutionize the whole CNN family. A friend and I were jaw-dropped the whole time we worked through it.
There's a 2018(?) follow-up paper that deepens the capsule net and changes the routing algorithm — I'll read it when I have a spare moment these days and see if there's any new insight.
So you mean CNNs can handle time series?
CNNs for time-series classification do exist — people have used them for whale-sound classification, and I've seen shallow CNNs used for DNA prediction, but it's not a common choice; it's really just converting the 2D array back to 1D.
A pure CNN architecture applied to algo trading can be used to find rising/falling signals, or to estimate trading signals / do regression.
A CNN+RNN hybrid model is the more sensible option, though.
Take the stock market: you have the current high, low, trading volume and so on — that's n inputs × a 1D time series, which is effectively equivalent to an image, so you can do everything mentioned above.
Using a CNN alone to predict ups/downs or do regression doesn't help much, because extrapolation along the time dimension is bound to work poorly. Some people use a CNN as a signal filter — strip out all the noise first, then feed an RNN/LSTM — which supposedly is more accurate.
For time series, use an LSTM as the backbone. Stock prediction currently gets something like 60% accuracy; compare that with strong classifiers like lung-cancer detection at 99.97%, and stock applications really contribute little to AI. Some finance professors even dare to publish papers that just say they trained an NN, without describing what the model looks like — they only show you the output kernels and claim it's accurate. You can guess it's some shallow NN.
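The "2D array back to 1D" and "CNN as a signal filter" ideas above both reduce to sliding a kernel along the series. A minimal pure-Python sketch of a valid 1-D convolution (my own illustration — real frameworks call this on tensors, and like them this is technically cross-correlation):

```python
def conv1d(series, kernel, stride=1):
    """Valid 1-D convolution: slide the kernel along the series and
    take a dot product at each position."""
    k = len(kernel)
    return [
        sum(series[i + j] * kernel[j] for j in range(k))
        for i in range(0, len(series) - k + 1, stride)
    ]

# Toy usage: a moving-average kernel acting as the noise-stripping
# "signal filter" before an RNN/LSTM, as described above.
prices = [10.0, 11.0, 9.0, 12.0, 13.0, 11.0]
smoothed = conv1d(prices, [1 / 3, 1 / 3, 1 / 3])
```

In a trained CNN the kernel weights are learned rather than fixed, but the sliding-window mechanics are the same — which is why an n-input × 1D-time-series matrix can be treated like a (very thin) image.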
What a coincidence — my dissertation is on the moral considerations of robots.
Bro, got any references?
A philosophy friend recommended this — not sure whether it's what you're after:
http://michaeljohnsonphilosophy.com/wp-content/uploads/2012/11/Mechanical-Mind.pdf
I think I've seen the paper you're talking about.
Is it the Stanford one?
Yes, the Stanford one.
Reads like an undergrad paper, written by a CS person.
I've been learning CNNs recently and found a paper on FCNs — using only convolutional layers, they do image segmentation more accurately and compute faster than a regular CNN. What does OP think?