Intro#
Install and import feedback gadget#
# @title Install and import feedback gadget
!pip install vibecheck datatops --quiet

from vibecheck import DatatopsContentReviewContainer

def content_review(notebook_section: str):
    return DatatopsContentReviewContainer(
        "",  # No text prompt
        notebook_section,
        {
            "url": "https://pmyvdlilci.execute-api.us-east-1.amazonaws.com/klab",
            "name": "neuromatch_neuroai",
            "user_key": "wb2cxze8",
        },
    ).render()
feedback_prefix = "W1D5_Intro"
Prerequisites#
The first two tutorials of this day don’t rely on specific frameworks or modeling techniques; they discuss fundamental operations using the most popular Python libraries for data processing. The last tutorial explores the attention mechanism introduced in Transformers, so it may be helpful to be familiar with this architecture, which was already presented in W1D1. No further specific knowledge is assumed.
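As a refresher on the Transformer attention mentioned above, here is a minimal NumPy sketch of scaled dot-product attention. The function name and the array shapes are illustrative, not taken from any tutorial in this course: each of the 3 queries is compared against 5 keys, and the resulting softmax weights mix the 5 values.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Sketch of attention: weights = softmax(Q K^T / sqrt(d)), output = weights @ V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over keys: rows sum to 1
    return weights @ V                             # weighted mixture of the values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))  # 3 queries of dimension 4
K = rng.standard_normal((5, 4))  # 5 keys of dimension 4
V = rng.standard_normal((5, 2))  # 5 values of dimension 2
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one output vector per query: (3, 2)
```

Tutorial 3 revisits this computation in detail; the sketch only shows the shape bookkeeping.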
Video#
Intro Video#
Submit your feedback#
# @title Submit your feedback
content_review(f"{feedback_prefix}_intro_video")