Darren's Developer Diary
Wednesday, December 6, 2017
Word Embeddings, NLP and I18N in H2O

Word embeddings can be thought of as a dimension-reduction tool that needs a sequence of tokens to learn from. They really are that generic, but I've only ever heard of them used for languages; i.e. the sequences are sentences, and the tokens are words (or compound words, or n-grams, or morphemes). Here is the preparation code, in R: bring in H2O, and define a couple of helper functions. One of them uses PCA to show just the first two dimensions. Here ...
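The excerpt cuts off before the code itself, so here is a minimal sketch of the kind of preparation it describes: load H2O, start a local cluster, and define a couple of helpers, one of which projects embedding vectors onto their first two principal components for plotting. The helper names (show_2d, word_vectors) and their details are illustrative assumptions, not the post's actual code.

library(h2o)
h2o.init(nthreads = -1)  # start (or connect to) a local H2O cluster

# Assumed helper: reduce a matrix of embedding vectors to two dimensions
# with PCA (base R prcomp) and plot them, labelling each point with its word.
show_2d <- function(vectors, labels) {
  p <- prcomp(vectors)
  plot(p$x[, 1], p$x[, 2], type = "n", xlab = "PC1", ylab = "PC2")
  text(p$x[, 1], p$x[, 2], labels = labels, cex = 0.8)
}

# Assumed helper: look up the embedding vector for each word in a trained
# H2O word2vec model; h2o.transform with aggregate_method = "NONE" returns
# one vector per input token.
word_vectors <- function(w2v, words) {
  hf <- as.h2o(data.frame(word = words, stringsAsFactors = FALSE))
  as.data.frame(h2o.transform(w2v, h2o.ascharacter(hf$word),
                              aggregate_method = "NONE"))
}

With helpers along these lines, the rest of the post's workflow would be a matter of training a word2vec model on tokenized sentences and then calling something like show_2d(word_vectors(w2v, some_words), some_words) to eyeball how the embedding has arranged them.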
http://darrendev.blogspot.com/


