A Generalist Agent - DeepMind
2022/06/20 20:33:56
Following DeepMind's progress, from playing games and chess to writing articles and code, it seems capable of almost anything. On 2022/5/12, DeepMind finally released A Generalist Agent. It is named Gato ("The agent, which we refer to as Gato, works as a multi-modal, multi-task, multi-embodiment generalist policy").

To handle this multi-modal data, DeepMind serializes all of it into a flat sequence of tokens. In this representation, Gato can be trained and sampled from much like a standard large-scale language model. During deployment, sampled tokens are assembled, depending on the context, into dialogue responses, captions, button presses, or other actions.

Below we report the tokenization scheme we found to produce the best results for Gato at the current scale using contemporary hardware and model architectures.

• Text is encoded via SentencePiece (Kudo and Richardson, 2018) with 32000 subwords into the integer range [0, 32000).
• Images are first transformed into sequences of non-overlapping 16 × 16 patches in raster order, as done in ViT (Dosovitskiy et al., 2020). Each pixel in the image patches is then normalized between [−1, 1] and divided by the square-root of the patch size (i.e. √16 = 4).
• Discrete values, e.g. Atari button presses, are flattened into sequences of integers in row-major order. The tokenized result is a sequence of integers within the range of [0, 1024).
• Continuous values, e.g. proprioceptive inputs or joint torques, are first flattened into sequences of floating point values in row-major order. The values are mu-law encoded to the range [−1, 1] if not already there (see figure 13 for details), then discretized to 1024 uniform bins. The discrete integers are then shifted to the range of [32000, 33024).
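The continuous-value scheme above can be sketched as follows. This is a minimal illustration, not DeepMind's code; the mu-law parameters (`mu=100`, `m=256`) are assumptions based on the paper's description (see its figure 13), and the bin mapping is one straightforward choice.

```python
import numpy as np

def mu_law_encode(x, mu=100, m=256):
    # Mu-law companding to compress values toward [-1, 1].
    # mu and m are assumed parameter values, not confirmed from figure 13.
    return np.sign(x) * np.log(np.abs(x) * mu + 1.0) / np.log(m * mu + 1.0)

def tokenize_continuous(values, bins=1024, offset=32000):
    # Flatten in row-major order, mu-law encode, clip to [-1, 1],
    # discretize into 1024 uniform bins, then shift into [32000, 33024).
    flat = np.asarray(values, dtype=np.float64).ravel(order="C")
    encoded = np.clip(mu_law_encode(flat), -1.0, 1.0)
    ids = np.floor((encoded + 1.0) / 2.0 * bins).astype(np.int64)
    ids = np.clip(ids, 0, bins - 1)  # put encoded value 1.0 in the top bin
    return ids + offset
```

For example, a joint-torque reading of exactly 0.0 lands in the middle bin, so its token id is 32000 + 512 = 32512.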

After converting data into tokens, we use the following canonical sequence ordering.
• Text tokens in the same order as the raw input text.
• Image patch tokens in raster order.
• Tensors in row-major order.
• Nested structures in lexicographical order by key.
• Agent timesteps as observation tokens followed by a separator, then action tokens.
• Agent episodes as timesteps in time order.
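The last two ordering rules, timesteps as observation tokens then a separator then action tokens, and episodes as timesteps in time order, can be sketched like this. The token ids and the `SEPARATOR` value are hypothetical; the real separator id is not specified here.

```python
SEPARATOR = 33024  # hypothetical separator token id, for illustration only

def timestep_tokens(observation_tokens, action_tokens):
    # One agent timestep: observation tokens, a separator, then action tokens.
    return list(observation_tokens) + [SEPARATOR] + list(action_tokens)

def episode_tokens(timesteps):
    # An episode is its timesteps concatenated in time order.
    seq = []
    for obs, act in timesteps:
        seq.extend(timestep_tokens(obs, act))
    return seq
```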

The positive significance of this work is that it demonstrates that combining CV, NLP, and RL is practical, and that sequence prediction can solve some decision-making problems. Considering that Gato's parameter count is only moderate at present, continuing to explore in this direction and building larger models would be very meaningful.