🧐Homework Answers 12.2. Convexity: Assume that we want to verify convexity of a set by drawing all lines between points within the set and checking whether the lines are contained.
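The check described in this exercise can be sketched numerically: sample points from the set, draw the segment between every pair, and test whether intermediate points stay inside. This is a heuristic sketch, not a proof of convexity; the function name `is_convex_sample` and the membership-test callback `contains` are illustrative assumptions, not part of the original exercise.

```python
import numpy as np
from itertools import combinations

def is_convex_sample(points, contains, n_ts=20):
    """Heuristic convexity check: for every pair of sample points,
    walk along the connecting segment and verify each intermediate
    point still satisfies the membership test `contains`.

    A single violation disproves convexity; passing only suggests it,
    since the check depends on the points and step count chosen."""
    for a, b in combinations(np.asarray(points, dtype=float), 2):
        for t in np.linspace(0.0, 1.0, n_ts):
            # Convex combination (1 - t) * a + t * b must lie in the set.
            if not contains((1 - t) * a + t * b):
                return False
    return True
```

For example, points on the unit circle pass the check against the unit disk (convex), but fail against an annulus (non-convex), because segments between far-apart boundary points cut through the hole.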
🧐Homework Answers 11.5. Self-Attention and Positional Encoding: Implement distance-based attention by modifying the DotProductAttention code. Note that you only need the squared norms of the keys for an efficient implementation.
🧐Homework Answers 11.4. Multi-Head Attention: Implement distance-based attention by modifying the DotProductAttention code. Note that you only need the squared norms of the keys for an efficient implementation.
🧐Homework Answers 11.3. Attention Scoring Functions: Implement distance-based attention by modifying the DotProductAttention code. Note that you only need the squared norms of the keys for an efficient implementation.
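The hint in the exercise above follows from expanding the squared distance: $-\tfrac{1}{2}\lVert q - k \rVert^2 = q^\top k - \tfrac{1}{2}\lVert q \rVert^2 - \tfrac{1}{2}\lVert k \rVert^2$. The query term is constant across keys and cancels inside the softmax, so beyond the usual dot products only the keys' squared norms are needed. Below is a minimal NumPy sketch of this idea; it is not the book's `DotProductAttention` class, and the function name `distance_attention` is an assumption for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def distance_attention(queries, keys, values):
    """Distance-based attention with scores -0.5 * ||q - k||^2.

    Expanding gives q.k - 0.5||q||^2 - 0.5||k||^2; the ||q||^2 term
    is the same for every key, so softmax ignores it and only the
    keys' squared norms must be computed on top of the dot products."""
    dots = queries @ keys.T                  # (num_q, num_k) dot products
    key_sq = (keys ** 2).sum(axis=1)         # (num_k,) squared key norms
    weights = softmax(dots - 0.5 * key_sq, axis=-1)
    return weights @ values
```

A quick sanity check: a query coinciding with one key should place nearly all its weight on that key, and the efficient scores should give the same attention weights as the full $-\tfrac{1}{2}\lVert q - k \rVert^2$ computation.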
🧐Homework Answers 11.1. Queries, Keys, and Values: Suppose that you wanted to reimplement approximate (key, query) matches as used in classical databases; which attention function would you pick?
😋Deep Learning Guide 8: Deep Recurrent Neural Networks and Bidirectional Recurrent Neural Networks: Despite having just one hidden layer between the input at any time step and the corresponding output, there is a sense in which these networks are deep.