Misspelling Oblivious Embeddings – Fabrizio Silvestri, Facebook AI

  1. We present a novel embedding that is resilient to misspellings
  2. We modify the popular fastText algorithm with a novel task-dependent loss
  3. We present experimental evidence that the method works in practice

Key Takeaways

• A new model to learn word embeddings (words or phrases mapped to dense vectors of numbers that represent their meaning) that are resilient to misspellings.

• We propose Misspelling Oblivious Embeddings (MOE), a new model that combines our open source library fastText with a supervised task that embeds misspellings close to their correct variants.

• The loss function of fastText aims to embed words that occur in the same context close to one another; we call this the semantic loss. In addition to the semantic loss, MOE considers a supervised loss that we call the spell correction loss, which aims to embed misspellings close to their correctly spelled variants. MOE trains by minimizing a weighted sum of the semantic loss and the spell correction loss.
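The combined objective above can be sketched in a few lines. This is a minimal illustration, not MOE's actual implementation: the positive-pair log-sigmoid terms, the vector names, and the weight `alpha` are assumptions chosen for clarity (the real model uses fastText's full negative-sampling objective over subword n-grams).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def semantic_loss(word_vec, context_vec):
    # fastText-style positive-pair term: pull a word toward a word
    # appearing in its context (negative-sampling terms omitted)
    return -np.log(sigmoid(word_vec @ context_vec))

def spell_correction_loss(misspelling_vec, correct_vec):
    # supervised term: pull a misspelling's vector toward the
    # vector of its correctly spelled variant
    return -np.log(sigmoid(misspelling_vec @ correct_vec))

def moe_loss(word_vec, context_vec, misspelling_vec, correct_vec, alpha=0.05):
    # MOE minimizes a weighted sum of the two losses;
    # alpha balances spelling robustness against semantic quality
    return ((1 - alpha) * semantic_loss(word_vec, context_vec)
            + alpha * spell_correction_loss(misspelling_vec, correct_vec))
```

Setting `alpha = 0` recovers the purely semantic objective, while larger values trade some semantic quality for robustness to misspellings.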


Ivana Kotorchevikj
