Tag: natural language processing


https://www.youtube.com/embed/fOvTtapxa9c Hi, I’m Carrie Anne, and welcome to Crash Course Computer Science! Last episode we talked about computer vision – giving computers the ability to see and understand visual information. Today we’re going to talk about how to give computers the ability to understand language. You might argue they’ve always had this capability. Back in Episodes 9 and 12, we talked about machine language instructions, as well as higher-level programming languages. While these certainly meet the definition of a language, they also tend to have small vocabularies and follow highly structured conventions. Code will only compile and run if it’s 100 percent free of spelling and syntactic errors. Of course, this is quite different from human languages – what are called natural languages – containing large, diverse vocabularies, words with several different meanings, speakers with different accents, and all sorts of interesting word play. People also make linguistic faux pas when…
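The claim above – that code compiles and runs only when it is completely free of syntax errors – is easy to demonstrate. This is an illustration not taken from the video (Python is chosen arbitrarily):

```python
# Illustration (not from the video): in a programming language, a single
# missing character is enough to stop a program from compiling, while
# readers of natural language recover from such typos without effort.

valid = "print('hello')"    # well-formed source
broken = "print('hello'"    # same line, missing the closing parenthesis

compile(valid, "<example>", "exec")  # compiles fine

try:
    compile(broken, "<example>", "exec")
except SyntaxError as err:
    print("refused to compile:", err.msg)
```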

https://www.youtube.com/embed/28RnXJrmmtE 30 years ago, when you wanted to find out information, you would probably buy a newspaper or go to the library. Today, people go to the internet, where powerful web-based search engines allow consumers to enter the information they're looking for, and the results appear before them immediately. It's odd that most ERP applications today still require cumbersome step-by-step procedures to serve up results that offer less than what the user requires. TrenData offers the most powerful People Analytics solution on the market today, but what we are most proud of is the system's ease of use. A big reason for this is our Natural Language Processing (NLP) capabilities. Our NLP search allows users to quickly find filtered metrics and analytics throughout our system without having to step through laborious menus or workflows. Just type in the information you are looking for, and, like a web search,…

https://www.youtube.com/embed/ogrJaOIuBx4 (Siraj) Hello, world! It's Siraj, and we're going to make an app that reads an article of text and creates a one-sentence summary of it using the power of natural language processing. Language is in many ways the seat of intelligence. It's the original communication protocol that we invented to describe all the incredibly complex processes happening in our neocortex. Do you ever feel like you're getting flooded with an increasing amount of articles and links and videos to choose from? As this data grows, so does the importance of semantic density. How can you say the most important things in the shortest amount of time? Having a generated summary lets you decide whether you want to dive deeper or not. And the better it gets, the more we'll be able to apply it to more complex language, like that in a scientific paper or…
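A minimal sketch of the kind of extractive summarizer described here – not Siraj's actual code, and the stopword list and scoring heuristic are assumptions: it returns the one sentence whose words are most frequent across the whole article.

```python
import re
from collections import Counter

# Tiny illustrative stopword list; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it", "that"}

def summarize(text):
    """Return the single sentence whose words are most frequent overall."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(w for w in words if w not in STOPWORDS)

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        # Average word frequency, so long sentences aren't favored unfairly.
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    return max(sentences, key=score)

article = ("Cats nap quietly. Language models are everywhere. "
           "Language models score text well.")
print(summarize(article))  # → Language models are everywhere.
```

The frequency heuristic is the simplest baseline; the video's app could equally use graph-based scoring (e.g. TextRank) over the same sentence split.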

https://www.youtube.com/embed/oi0JXuL19TA Thanks to Curiosity Stream for supporting PBS Digital Studios. Hey, I’m Jabril, and welcome to Crash Course AI! Language is one of the most impressive things humans do. It’s how I’m transferring knowledge from my brain to yours right this second! Languages come in many shapes and sizes; they can be spoken or written, and are made up of different components like sentences, words, and characters that vary across cultures. For instance, English has 26 letters, and Chinese has tens of thousands of characters. So far, a lot of the problems we’ve been solving with AI and machine learning technologies have involved processing images, but the most common way that most of us interact with computers is through language. We type questions into search engines, we talk to our smartphones to set alarms, and sometimes we even get a little help with our Spanish homework from Google Translate. So today, we’re going…

https://www.youtube.com/embed/KVxIx8f_VpM Hey guys, welcome to this session by Intellipaat. Now, you've probably used virtual assistants such as Siri, Cortana, and Alexa, and these virtual assistants really do make tasks very easy, don't they? So, let's say you ask Siri what the distance between Earth and the Sun is; it will immediately reply that the distance is 149.6 million kilometers. Similarly, if you ask Siri how to say "I'm hungry" in Hindi, it will reply, "Mai bhukha hu." Now, how is Siri or any other virtual assistant able to do this? Well, all of this is possible because of Natural Language Processing, so keeping in mind how important NLP is, we have come up with this tutorial on Natural Language Processing. So let's have a quick glance at the agenda: we'll start by understanding what exactly NLP is, and then we'll learn how to tokenize words…
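Tokenizing words – the first step on the agenda above – can be sketched with a plain regular expression. This is an illustration of what tokenization does, not Intellipaat's actual code (the tutorial presumably uses an NLP library):

```python
import re

def tokenize(text):
    """Split text into word tokens and punctuation tokens."""
    # \w+ grabs runs of word characters; [^\w\s] grabs single punctuation marks.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("What is the distance between Earth and the Sun?"))
# → ['What', 'is', 'the', 'distance', 'between', 'Earth', 'and', 'the', 'Sun', '?']
```

Real tokenizers handle contractions, abbreviations, and language-specific rules, which is why libraries ship trained tokenizers rather than one regex.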

https://www.youtube.com/embed/tMAU3gLbKII Hi, I'm Ines. I'm the co-founder of Explosion AI, a core developer of spaCy and the lead developer of Prodigy, our annotation tool for machine learning and NLP. It's been really really exciting to see Prodigy grow so much over the past year, talk to so many people in the community, see what they're working on and discuss strategies for creating training data for machine learning projects. Coming up with the right strategy is often genuinely difficult and requires a lot of experimentation. The most critical phase is the early development phase when you try out ideas, design your label scheme and basically decide what you want to predict. We've built Prodigy to make this process easier and to help you iterate on your code and your data faster. In this video I'll be talking about a few frequently asked questions that have come up on the forum. I'll…

https://www.youtube.com/embed/sqDHBH9IjRU Last week, we released version 2 of our natural language processing library spaCy, which brings the library up to date with the latest deep learning methodologies for solving tasks such as tagging, parsing, and named entity recognition. So naturally, we've had a lot of questions about the details of how the statistical models we use work and why we did things the way we did. To answer that in an easy way, I've decided to dust off some slides from a presentation I gave recently at Zalando Research, to take you through the thought process behind this and also to introduce you to how we think about what we're doing here, and some of the ins and outs and whys. So this is the overview of the activities that we're doing at Explosion AI. spaCy, as I said, is this open…
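For illustration only, here is a toy, rule-based stand-in for the named entity recognition task mentioned above, just to show the task's input/output shape. The gazetteer entries and labels are invented for this sketch; spaCy v2 replaces lookups like this with statistical models.

```python
# Toy gazetteer lookup: NOT spaCy's approach, only a sketch of the NER task.
GAZETTEER = {
    "Zalando Research": "ORG",
    "Explosion AI": "ORG",
    "spaCy": "PRODUCT",
}

def recognize(text):
    """Return (span, label) pairs for every gazetteer entry found in text."""
    return [(span, label) for span, label in GAZETTEER.items() if span in text]

print(recognize("spaCy is developed at Explosion AI."))
# → [('Explosion AI', 'ORG'), ('spaCy', 'PRODUCT')]
```

A statistical model generalizes where a lookup cannot: it can label names it has never seen, based on context.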

https://www.youtube.com/embed/ys8MzznzI24 [LAUGHTER] And OK. Good afternoon. And welcome to, I believe, the first NLP talk of the autumn quarter, 2019. We're live streaming thanks to some very nimble help from computer science and engineering support. We're broadcasting live. I'm delighted today to introduce two speakers. I'll introduce the first one. We'll hear the talk. And then, after questions, we'll move to the second speaker. And I'll introduce him when it's time, assuming his slides are ready when it's time for him. [LAUGHTER] OK. No pressure. So first up today, we have Nathan Schneider, who is an assistant professor in computer science and linguistics at Georgetown University, where he's been for about three years. Prior to that, he was a postdoc at the University of Edinburgh working with Mark Steedman. And prior to that, he was a PhD student working in my group back when we were at Carnegie Mellon. And…

https://www.youtube.com/embed/ERibwqs9p38 [MUSIC] Stanford University. >> Okay, so let's get going. Welcome back to the second class of CS224N / Ling 284, Natural Language Processing with Deep Learning. This class is going to be almost the complete opposite of the last class. The last class was a very high-level picture, from the very top down: saying a little bit about what natural language processing is, what deep learning is, why both of them are exciting, and how I'd like to put them together. For today's class, we're going to go completely to the opposite extreme. We're going to go right down to the bottom, to words, and we're going to have vectors, and we're going to do baby math. Now, for some of you this will seem like tedious, repetitive baby math. But I think that there are probably quite a few…
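The "baby math" on word vectors boils down to dot products and cosine similarity, cos(u, v) = u·v / (|u| |v|). A sketch with made-up 3-dimensional vectors (real word embeddings have hundreds of dimensions, and these values are invented for illustration):

```python
import math

def cosine(u, v):
    """Cosine similarity: dot(u, v) divided by the product of the norms."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Invented 3-d vectors: related words point in similar directions.
vectors = {
    "king":   [0.9, 0.8, 0.1],
    "queen":  [0.8, 0.9, 0.1],
    "banana": [0.1, 0.0, 0.9],
}

print(cosine(vectors["king"], vectors["queen"]))   # close to 1.0
print(cosine(vectors["king"], vectors["banana"]))  # much smaller
```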

https://www.youtube.com/embed/bZMKhQSERA4 Welcome, Chris. It's my pleasure to welcome you to Microsoft Research, where you'll be giving a distinguished AI seminar later today. For the audience: Chris is the Siebel Professor of Machine Learning at Stanford University, where he has an appointment in both linguistics and computer science. You head the Stanford AI Lab, and you've recently become part of the Stanford Human-Centered AI initiative, the HAI effort. You wear many hats, and one of the things that I've loved about your career is its focus on statistical NLP, ranging from work on parsing to grammar induction to lots of applications in information retrieval and machine translation and such. You're amazingly well respected and successful: you have more than a hundred thousand citations, I think, two very popular textbooks, and you're a fellow of the ACM, AAAI, and the ACL. Is that right…