How deeply are you involved with Natural Language Processing (NLP)?
Thread poster: Joel Pina Diaz

Joel Pina Diaz  Identity Verified
Argentina
Local time: 13:12
Member (2009)
English to Spanish
+ ...
May 17

Among several techniques, Bidirectional Encoder Representations from Transformers ("BERT"), a neural network-based technique for natural language processing (NLP) pre-training, is changing how search models rank results and generate featured snippets for your queries. For now it applies mostly to US English, but several more languages will be added in the near future (a market open for translators).

"One of the biggest challenges in natural language processing (NLP) is the shortage of training data. Because NLP is a diversified field with many distinct tasks, most task-specific datasets contain only a few thousand or a few hundred thousand human-labeled training examples. However, modern deep learning-based NLP models see benefits from much larger amounts of data, improving when trained on millions, or billions, of annotated training examples. To help close this gap in data, researchers have developed a variety of techniques for training general purpose language representation models using the enormous amount of unannotated text on the web (known as pre-training). The pre-trained model can then be fine-tuned on small-data NLP tasks like question answering and sentiment analysis, resulting in substantial accuracy improvements compared to training on these datasets from scratch..."
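The pre-training the quote describes is driven, in BERT's case, by a masked language modeling objective: tokens in unannotated text are hidden at random, and the model learns to predict them from the context on both sides. A minimal sketch of that input-masking step (plain Python; the 15% masking rate follows the BERT paper, while the sentence, function name, and seed are illustrative, not from any real library):

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=1):
    """Sketch of BERT-style masked-LM input creation: randomly replace
    a fraction of tokens with [MASK]; during pre-training the model is
    asked to recover the originals from bidirectional context."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)    # target the model must predict
        else:
            masked.append(tok)
            labels.append(None)   # no loss computed on this position
    return masked, labels

tokens = "the translator reviewed the machine output".split()
masked, labels = mask_tokens(tokens)
# Which positions get masked depends on the random seed.
```

Because the masked positions carry their original token as a label, the web itself supplies the "annotations", which is how pre-training sidesteps the training-data shortage the quote opens with.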

Language understanding remains an ongoing challenge, and this is a trend we should be ready to ride.

Full information in the following link (Google A.I. Blog): https://ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html


 







