[[Word Embeddings]]
## Questions
- Can you make embeddings hierarchical, e.g. one embedding for the whole doc + one embedding per paragraph + one embedding per sentence? (rough pooling sketch after this list)
- e.g. how might you relate word embeddings to sentence embeddings?
- could take the view that the tensor products of the word embeddings span a space of potential sentences we can make with those words (rough numpy sketch below)
- how does that relate to positional encodings? (sinusoidal-PE sketch below for comparison)
- what sort of topological object / space does that give us?
- how could you feed that back in as model input?
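A minimal pooling sketch for the hierarchy question, assuming some sentence encoder exists; `embed_sentence` below is just a placeholder (a real setup would swap in e.g. a sentence-transformers model), and the paragraph/doc levels are simply means of the level below.

```python
# Sketch: doc -> paragraph -> sentence hierarchy built by mean-pooling.
# `embed_sentence` is a stand-in for any real sentence encoder.
import numpy as np

def embed_sentence(sentence: str, dim: int = 8) -> np.ndarray:
    # Placeholder encoder: deterministic pseudo-random vector per sentence.
    rng = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return rng.normal(size=dim)

def embed_paragraph(sentences: list[str]) -> np.ndarray:
    # Paragraph embedding = mean of its sentence embeddings.
    return np.mean([embed_sentence(s) for s in sentences], axis=0)

def embed_doc(paragraphs: list[list[str]]) -> np.ndarray:
    # Doc embedding = mean of its paragraph embeddings.
    return np.mean([embed_paragraph(p) for p in paragraphs], axis=0)

doc = [
    ["Word embeddings map tokens to vectors.", "Similar words end up nearby."],
    ["Sentence embeddings can be pooled from word embeddings."],
]
print(embed_paragraph(doc[0]).shape)  # (8,)
print(embed_doc(doc).shape)           # (8,)
```

Mean-pooling is only the simplest aggregator; attention pooling or a learned combiner would fit the same doc/paragraph/sentence interface.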
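A rough numpy sketch of the tensor-product view: n word vectors of dimension d combine into a rank-1 tensor living in a d**n-dimensional space, and swapping word order gives a different tensor, which is one place the word-order / positional-encoding question shows up. The vocabulary and vectors here are made up.

```python
# Sketch of the tensor-product view: an n-word sentence with d-dim word vectors
# corresponds to a rank-1 tensor in a d**n-dimensional space.
import numpy as np

d = 4
rng = np.random.default_rng(0)
words = {w: rng.normal(size=d) for w in ["the", "cat", "sat"]}

def sentence_tensor(tokens: list[str]) -> np.ndarray:
    # Tensor (outer) product of the word vectors, taken in order.
    out = words[tokens[0]]
    for t in tokens[1:]:
        out = np.tensordot(out, words[t], axes=0)  # adds one mode per word
    return out

t1 = sentence_tensor(["the", "cat", "sat"])
t2 = sentence_tensor(["cat", "the", "sat"])
print(t1.shape)             # (4, 4, 4) -> lives in a 4**3-dim space
print(np.allclose(t1, t2))  # False: word order changes the tensor
```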
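For comparison, the standard sinusoidal positional encodings from the original Transformer paper, added to the word embeddings so a sequence of otherwise order-free vectors carries position information; the sequence length and dimensions here are arbitrary.

```python
# Standard sinusoidal positional encodings, added to word embeddings
# so that order information enters the token representations.
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    positions = np.arange(seq_len)[:, None]           # (seq_len, 1)
    dims = np.arange(d_model // 2)[None, :]           # (1, d_model // 2)
    angles = positions / (10000 ** (2 * dims / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                      # even dimensions
    pe[:, 1::2] = np.cos(angles)                      # odd dimensions
    return pe

word_vecs = np.random.default_rng(0).normal(size=(3, 8))  # 3 tokens, d_model=8
inputs = word_vecs + positional_encoding(3, 8)             # position-aware inputs
print(inputs.shape)  # (3, 8)
```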