WEBVTT

00:00.000 --> 00:12.000
Okay, so I guess I have to have an opening, so I am going to start. I want to present to you

00:12.000 --> 00:17.000
PyCRAM, a framework for cognition-enabled robot control.

00:17.000 --> 00:23.600
So, if we think of robots nowadays, it is really something like this: industrial robots

00:24.600 --> 00:28.600
working in a controlled environment, something like that.

00:28.600 --> 00:33.600
That is, if we are being realistic, where robots work today.

00:33.600 --> 00:37.600
What you really don't have is something like this:

00:37.600 --> 00:41.600
complex manipulation robots that actually work in household environments,

00:41.600 --> 00:48.600
that is, complex, dynamic environments with a lot of variables and all of our stuff.

00:49.600 --> 00:51.600
Why is that?

00:51.600 --> 00:55.600
What are the problems that we have with getting robots into household environments?

00:55.600 --> 00:57.600
First point: humans.

00:57.600 --> 00:58.600
Humans.

00:58.600 --> 01:00.600
Very annoying factor.

01:00.600 --> 01:03.600
They are everywhere, walking around, and they are unpredictable.

01:03.600 --> 01:05.600
We have to work with them somehow.

01:05.600 --> 01:08.600
We have to work around them, and that makes things complicated.

01:08.600 --> 01:12.600
And then things move, in contrast, for example, to industrial robotics.

01:12.600 --> 01:15.600
The objects we deal with don't have a fixed state.

01:15.600 --> 01:18.600
They vary a little bit, maybe by a few centimeters.

01:18.600 --> 01:22.600
If you try to grab something, that might already be too much.

01:22.600 --> 01:27.600
So robots have to dynamically react to failures.

01:27.600 --> 01:33.600
So if the robot wants to grab something and it drops it,

01:33.600 --> 01:35.600
it has to react to that,

01:35.600 --> 01:38.600
find a strategy to correct that, and so on.

01:39.600 --> 01:42.600
On the same topic: robots have to re-plan.

01:42.600 --> 01:48.600
So if, for example, they want to grab something and then it is not there,

01:48.600 --> 01:53.600
they have to find a strategy to mitigate that and to find it in another place.

01:53.600 --> 01:56.600
They basically have to adapt the whole plan.

01:56.600 --> 01:58.600
And that is not an exhaustive list.

01:58.600 --> 02:01.600
There are all sorts of other things, I don't know.

02:01.600 --> 02:04.600
So what are we going to do?

02:04.600 --> 02:10.600
Basically, robots have to become more intelligent.

02:10.600 --> 02:14.600
So the robots have to understand what they are doing,

02:14.600 --> 02:18.600
not just in the sense of knowing what movements they are executing,

02:18.600 --> 02:21.600
but they need to know the actual semantics,

02:21.600 --> 02:26.600
like: the movement I am doing right now is to pick up something,

02:26.600 --> 02:31.600
or I am doing this motion to get to another position to do something there.

02:31.600 --> 02:38.600
So the robot actually knows what it is doing in the bigger context of the task it is trying to achieve.

02:38.600 --> 02:47.600
And robots need to reason about what they are trying to achieve, how to proceed, and how to mitigate failures.

02:47.600 --> 02:54.600
So, for example, the robot is tasked, in the kitchen,

02:55.600 --> 02:58.600
with preparing the breakfast.

02:58.600 --> 03:02.600
And the robot knows: okay, it needs to set up the breakfast,

03:02.600 --> 03:08.600
with cereal, milk, a bowl and a spoon, a classic breakfast.

03:08.600 --> 03:12.600
But the robot does not know where to get those.
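NOTE
Editor's sketch for the "semantics of movements" point above: a minimal way
to attach task-level meaning to plan steps instead of bare motions. All names
here are illustrative assumptions, not the actual PyCRAM API.
    # Python: plan steps that carry semantics, not just motions (hypothetical)
    from dataclasses import dataclass
    @dataclass
    class SemanticAction:
        action_type: str  # what the motion means, e.g. "pick-up", "navigate"
        target: str       # symbolic target, e.g. "cereal"
        purpose: str      # why this step exists in the overall task
    breakfast_plan = [
        SemanticAction("navigate", "kitchen_counter", "get within reach of the cereal"),
        SemanticAction("pick-up", "cereal", "collect the breakfast items"),
    ]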
03:12.600 --> 03:16.600
So it needs more information on where to get them.

03:16.600 --> 03:19.600
Now, a quick switch:

03:19.600 --> 03:24.600
how does PyCRAM tackle these problems?

03:24.600 --> 03:29.600
PyCRAM basically provides syntax to natively use descriptions of actions,

03:29.600 --> 03:32.600
objects and locations, which are only

03:32.600 --> 03:34.600
resolved during runtime.

03:34.600 --> 03:38.600
So basically, you can symbolically describe a location,

03:38.600 --> 03:42.600
for example, a location the robot has to stand at to pick up something,

03:42.600 --> 03:44.600
that is only resolved

03:44.600 --> 03:48.600
exactly when the robot tries to find that location.

03:48.600 --> 03:53.600
So, in the current context of the robot, it takes all the information

03:53.600 --> 04:00.600
it has, compiles that, and then finds a solution that is fitting for the situation.

04:00.600 --> 04:04.600
Yes, that is the first approach, basically.

04:04.600 --> 04:09.600
And the next thing is: PyCRAM has an internal belief state,

04:09.600 --> 04:14.600
and it basically takes all sorts of relevant knowledge,

04:15.600 --> 04:18.600
like the poses of objects, data from sensors,

04:18.600 --> 04:22.600
and integrates all that to try to get an internal simulation

04:22.600 --> 04:27.600
that is as close to the real state of the world as possible.

04:27.600 --> 04:30.600
And then we can reason on that, because, as mentioned,

04:30.600 --> 04:34.600
we can resolve the symbolic descriptions and plan with all of that.

04:34.600 --> 04:38.600
And then, last thing: PyCRAM basically

04:38.600 --> 04:41.600
records the whole execution,

04:41.600 --> 04:48.600
puts it in a database, and then you can learn from the past executions

04:48.600 --> 04:50.600
and improve over time.

04:50.600 --> 04:54.600
So yeah, here is where you can find it, if you want to check it out.

04:54.600 --> 04:57.600
And with that, thank you.
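NOTE
Editor's sketch for the runtime-resolution part of the talk: a symbolic
location description that is only resolved against the robot's current
belief state at execution time. The names and the toy belief state are
assumptions for illustration, not the actual PyCRAM designator API.
    # Python: resolve a symbolic description only when it is needed (hypothetical)
    belief_state = {"cereal": (1.2, 0.4)}  # object name -> believed (x, y) pose
    class LocationDescription:
        """A spot the robot should stand at to reach a given object."""
        def __init__(self, reachable_for):
            self.reachable_for = reachable_for
        def resolve(self, beliefs):
            # Resolution happens only now, with the current world knowledge,
            # so the same description fits whatever the situation is.
            x, y = beliefs[self.reachable_for]
            return (x - 0.5, y)  # stand half a meter in front of the object
    pick_spot = LocationDescription("cereal")
    print(pick_spot.resolve(belief_state))  # -> (0.7, 0.4), resolved at runtime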
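NOTE
Editor's sketch for the execution-logging part: recording executed actions
with their outcomes in a database so later runs can learn from past
executions. sqlite3 is in Python's standard library; the one-table schema is
an assumption, not PyCRAM's actual logging format.
    # Python: log executions and query past outcomes (schema is hypothetical)
    import sqlite3
    conn = sqlite3.connect("executions.db")
    conn.execute("CREATE TABLE IF NOT EXISTS log (action TEXT, target TEXT, outcome TEXT)")
    conn.execute("INSERT INTO log VALUES (?, ?, ?)", ("pick-up", "cereal", "success"))
    conn.commit()
    # A later run could, for example, count logged attempts per action type:
    for row in conn.execute("SELECT action, COUNT(*) FROM log GROUP BY action"):
        print(row)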