From 049186eb943c45591cb540ce9000b545fc4b2bd2 Mon Sep 17 00:00:00 2001
From: MattRoweEAIF
Date: Tue, 15 Oct 2024 18:31:29 +0200
Subject: [PATCH] adjusted mkdocs.yml settings

---
 Docs/carla_ue5_logo.png      |  Bin 0 -> 33749 bytes
 Docs/tuto_A_add_vehicle.md   |  333 ---------
 Docs/tuto_G_retrieve_data.md | 1343 ----------------------------------
 Docs/tutorials.md            |   10 +-
 mkdocs.yml                   |    8 +-
 5 files changed, 2 insertions(+), 1692 deletions(-)
 create mode 100644 Docs/carla_ue5_logo.png
 delete mode 100644 Docs/tuto_A_add_vehicle.md
 delete mode 100644 Docs/tuto_G_retrieve_data.md

diff --git a/Docs/carla_ue5_logo.png b/Docs/carla_ue5_logo.png
new file mode 100644
index 0000000000000000000000000000000000000000..e859f49d3a3974e46f3c12996377b9828eac6ff6
GIT binary patch
literal 33749
[base85-encoded PNG data omitted]

literal 0
HcmV?d00001

diff --git a/Docs/tuto_A_add_vehicle.md b/Docs/tuto_A_add_vehicle.md
deleted file mode 100644
index 635290f37..000000000
--- a/Docs/tuto_A_add_vehicle.md
+++ /dev/null
@@ -1,333 +0,0 @@
-# Add a new vehicle
-
-This tutorial details how to add a new vehicle to CARLA. There are two sections, one for 4 wheeled vehicles and one for 2 wheeled vehicles. It outlines the basic requirements that must be fulfilled when modeling your vehicle to ensure that it works well in CARLA, and the configuration steps required after the vehicle has been imported into Unreal Engine.
-
-* [__Add a 4 wheeled vehicle__](#add-a-4-wheeled-vehicle)
-    * [Bind and model the vehicle](#bind-and-model-the-vehicle)
-    * [Import and configure the vehicle](#import-and-configure-the-vehicle)
-* [__Add a 2 wheeled vehicle__](#add-a-2-wheeled-vehicle)
-
-!!! Important
-    This tutorial only applies to users who work with a build from source and have access to the Unreal Engine Editor.
-
----
-## Add a 4 wheeled vehicle
-
-Vehicles added to CARLA need to use a __common base skeleton__, which is found [__here__](https://carla-assets.s3.eu-west-3.amazonaws.com/fbx/VehicleSkeleton.rar). This link downloads an archive called `VehicleSkeleton.rar` which contains the base skeleton in two different `.fbx` formats, one ASCII and the other binary. The format you use will depend on your 3D modeling software requirements.
-
-__The positions of the skeleton bones can be changed, but any other manipulation, such as rotation, addition of new bones, or changing the current hierarchy, will lead to errors.__
-
----
-
-### Bind and model the vehicle
-
-This section details the minimum requirements in the modeling stage of your vehicle to make sure it can be used successfully in CARLA. The process involves binding the skeleton correctly to the base and wheels of the vehicle, creating the Physical Asset and raycast sensor meshes, and exporting to the correct format.
-
-__1. Import the base skeleton.__
-
-Import the base skeleton into your preferred 3D modeling software. Common editors include Maya and Blender.
-
-__2. Bind the bones.__
-
-Bind the bones to the corresponding portions of the vehicle mesh according to the nomenclature below. Make sure to center the wheels' bones within the mesh.
-
-* __Front left wheel:__ `Wheel_Front_Left`
-* __Front right wheel:__ `Wheel_Front_Right`
-* __Rear left wheel:__ `Wheel_Rear_Left`
-* __Rear right wheel:__ `Wheel_Rear_Right`
-* __Rest of the mesh:__ `VehicleBase`
-
-!!! Warning
-    Do not make any changes to the bone names or the hierarchy, and do not add any new bones.
-
-__3. Model your vehicle.__
-
-Vehicles should have between approximately 50,000 and 100,000 tris. We model the vehicles using the size and scale of actual cars.
-
-We recommend that you divide the vehicle into the following materials:
-
->1. __Bodywork__: The metallic part of the vehicle. This material is changed to an Unreal Engine material. Logos and details can be added but, to be visible, they must be painted in a different color by using the alpha channels in the Unreal Engine editor.
-- __Glass_Ext__: A layer of glass that allows visibility from the outside to the inside of the vehicle.
-- __Glass_Int__: A layer of glass that allows visibility from the inside to the outside of the vehicle.
-- __Lights__: Headlights, indicator lights, etc.
-- __LightGlass_Ext__: A layer of glass that allows visibility from the outside to the inside of the light.
-- __LightGlass_Int__: A layer of glass that allows visibility from the inside to the outside of the light.
-- __LicensePlate__: A rectangular plane of 29x12 cm. For best results, use the `.fbx` provided by CARLA; download it [here](https://carla-assets.s3.eu-west-3.amazonaws.com/fbx/LicensePlate.rar). The texture will be assigned automatically in Unreal Engine.
-- __Interior__: Any other details that don't fit in the above sections can go into _Interior_.
-
-Materials should be named using the format `M_CarPart_CarName`, e.g., `M_Bodywork_Mustang`.
-
-Textures should be named using the format `T_CarPart_CarName`, e.g., `T_Bodywork_Mustang`, and sized at 2048x2048 pixels.
-
-Unreal Engine automatically creates LODs, but you can also create them manually in your 3D editor. Tri counts are as follows:
-
-- __LOD 0__: 100,000 tris
-- __LOD 1__: 80,000 tris
-- __LOD 2__: 60,000 tris
-- __LOD 3__: 30,000 tris
-
-__4. Create the Physical Asset mesh.__
-
-The Physical Asset mesh is an additional mesh that allows Unreal Engine to calculate the vehicle's physics. It should be as simple as possible, with a reduced number of polygons, and should cover the whole vehicle except for the wheels. See the image below for an example.
-
->>![physical asset mesh](../img/physical_asset_mesh.png)
-
-The Physical Asset mesh should be exported as a separate `.fbx` file. The final file should fulfill the following requirements:
-
-- Have a base mesh. This should be a copy of the Physical Asset mesh, and it should have the same name as the original vehicle.
-- The Physical Asset mesh must be named using the format `UCX_<vehicle_name>_<number_of_mesh>`, __otherwise it will not be recognized by Unreal Engine.__
-- The mesh must not extend beyond the boundaries of the original model.
-- The mesh should have the same position as the original model.
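
The naming conventions above lend themselves to an automated check before import. A sketch using regular expressions (the `UCX` pattern assumes the mesh number is purely numeric, and the vehicle name "Mustang" in the usage below is illustrative):

```python
import re

# One pattern per naming convention described above.
CONVENTIONS = {
    "material": re.compile(r"M_[A-Za-z0-9]+_[A-Za-z0-9]+"),  # M_CarPart_CarName
    "texture":  re.compile(r"T_[A-Za-z0-9]+_[A-Za-z0-9]+"),  # T_CarPart_CarName
    "physics":  re.compile(r"UCX_[A-Za-z0-9]+_\d+"),         # UCX_<vehicle_name>_<number_of_mesh>
}

def follows_convention(kind: str, name: str) -> bool:
    """Return True if `name` matches the convention for asset `kind`."""
    return CONVENTIONS[kind].fullmatch(name) is not None
```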
- ->>![base mesh](../img/base_mesh.png) - -Export the final mesh as an `.fbx` file with the name `SMC_.fbx`. - -__5. Create the mesh for the raycast sensor.__ - -The raycast sensor mesh sets up the vehicle's shape that will be detected by the raycast sensors (RADAR, LiDAR, and Semantic LiDAR). This mesh should have a slightly more defined geometry than the Physical Asset mesh in order to increase the realism of sensor simulation but not as detailed as the car mesh for performance reasons. - -Consider the following points when creating the raycast sensor mesh: - -- The mesh should cover all aspects of the vehicle, including wheels, side mirrors, and grilles. -- The wheels should be cylinders of no more than 16 loops. -- Various meshes can be joined together if required. -- The mesh(es) must not extend beyond the boundaries of the original model. -- The mesh(es) should have the same position as the original. - ->>![collision mesh](../img/collision_mesh.png) - -Export the final mesh as an `.fbx` file with the name `SM_sc_.fbx`. - -__5. Export the vehicle mesh(es).__ - -Select all the main vehicle mesh(es) and the skeleton base and export as `.fbx`. - ---- - -### Import and configure the vehicle - -This section details the process of importing the vehicle into Unreal Engine for use in CARLA. Perform these steps in the Unreal Engine editor. - -__1. Create the vehicle folder.__ - -Create a new folder named `` in `Content/Carla/Static/Vehicles/4Wheeled`. - -__2. Import the `.fbx`.__ - -Inside the new vehicle folder, import your main vehicle skeleton `.fbx` by right-clicking in the **_Content Browser_** and selecting **_Import into Game/Carla/Static/Vehicles/4Wheeled/_**. - -In the dialogue box that pops up: - -- Set **_Import Content Type_** to `Geometry and Skinning Weights`. -- Set **_Normal Import Method_** to `Import Normals`. -- Optionally set **_Material Import Method_** to `Do not create materials`. 
Uncheck **_Import Textures_** to avoid Unreal Engine creating default materials. - -The Skeletal Mesh will appear along with two new files, `_PhysicsAssets` and `_Skeleton`. - -Import the rest of your `.fbx` files separately from the main vehicle skeleton `.fbx` file. - -__3. Set the physical asset mesh.__ - ->1. Open `_PhysicsAssets` from the **_Content Browser_**. -- Right-click on the `Vehicle_Base` mesh in the **_Skeleton Tree_** panel and go to **_Copy Collision from StaticMesh_**. -- Search for and select your `SMC_` file. You should see the outline of the physical asset mesh appear in the viewport. -- Delete the default capsule shape from the `Vehicle_Base`. -- Select all the wheels: - - Go to the **_Tools_** panel and change the **_Primitive Type_** to `Sphere`. - - Go to the **_Details_** panel and change **_Physics Type_** to `Kinematic`. - - Set **_Linear Damping_** to `0`. This will eliminate any extra friction on the wheels. -- Enable **_Simulation Generates Hit Event_** for all meshes. -- Click **_Re-generate Bodies_**. -- Adjust the wheel sphere to the size of the wheel. -- Save and close the window. - ->![Collision mesh](../img/collision_mesh_vehicle.png) - -__4. Create the Animation Blueprint.__ - ->1. In the **_Content Browser_**, right-click inside your vehicle folder and select **_Animation -> Animation Blueprint_**. -- In **_Parent Class_** search for and select `VehicleAnimInstance`. -- In **_Target Skeleton_** search for and select `_Skeleton`. -- Press **_OK_** and rename the blueprint as `AnimBP_`. - -__5. Configure the Animation Blueprint.__ - -To ease the process of configuring the animation blueprint, we will copy an existing one from a native CARLA vehicle: - ->1. Go to `Content/Carla/Static/Vehicle` and choose any CARLA vehicle folder. Open its Animation Blueprint. -- In the **_My Blueprint_** panel, double click on **_AnimGraph_**. You will see the graph come up in the viewport. 
-- Click and drag to select the **_Mesh Space Ref Pose_**, **_Wheel Handler_**, and **_Component To Local_** components. Right-click and select **_Copy_**. -- Go back to your own vehicle Animation Blueprint and paste the copied contents into the graph area. -- Click and drag from the standing figure in the **_Component To Local_** component to the figure in **_Output Pose_** to join the components together. -- Click **_Compile_** in the top left corner. You should now see a pulsating line flowing through the entire sequence. -- Save and close the window. - ->>![add_vehicle_step_04](img/add_vehicle_step_04.jpg) - -__6. Prepare the vehicle and wheel blueprints.__ - ->1. In the **_Content Browser_**, go to `Content/Carla/Blueprints/Vehicles` and create a new folder ``. -- Inside the folder, right-click and go to **_Blueprint Class_**. Open the **_All Classes_** section in the pop-up. -- Search for `BaseVehiclePawn` and press **_Select_**. -- Rename the file as `BP_`. -- Go to the folder of any of the native CARLA vehicles in `Carla/Blueprints/Vehicles`. From the **_Content Browser_**, copy the four wheel blueprints into the blueprint folder for your own vehicle. Rename the files to replace the old vehicle name with your own vehicle name. - ->>![Copy wheel blueprints](../img/copy_wheel_blueprint.png) - -__7. Configure the wheel blueprints.__ - ->1. In your vehicle blueprint folder, open all four of the wheel blueprints. -- In the **_Class Defaults_** panel, set **_Collision Mesh_** to `Wheel_Shape`. __Omitting this step will cause the vehicle wheels to sink into the ground__. -- Adjust the values for wheel shape radius, width, mass, and damping rate according to your vehicle specifications. -- Set **_Tire Config_** to `CommonTireConfig` -- On the front wheels set **_Steer Angle_** according to your preferences (default is `70`). Uncheck **_Affected by Handbrake_**. -- On the rear wheels set **_Steer Angle_** to `0`. Check **_Affected by Handbrake_**. 
-- When setting the suspension values, you can use the values [here](tuto_D_customize_vehicle_suspension.md) as a guide. -- Compile and save. - ->>![wheel shape](../img/wheel_shape.png) - -__8. Configure vehicle blueprint.__ - ->1. From the **_Content Browser_**, open your `BP_`. -- In the **_Components_** panel, select **_Mesh (VehicleMesh) (Inherited)_**. -- In the **_Details_** panel, go to **_Skeletal Mesh_** and search for and select the base skeleton file of your vehicle (located in the `Carla/Static/Vehicles/4Wheeled/` folder). -- Go to **_Anim Class_** in the **_Details_** panel. Search for and select your `AnimBP_` file. -- In the **_Components_** panel, select **_Custom Collision (Inherited)_**. -- Select **_Static Mesh_** in the **_Details_** panel and search for your `SM_sc_` raycast sensor mesh. -- In the **_Components_** panel, select **_VehicleMovement (MovementComp) (Inherited)_**. -- In the **_Details_** panel, search for `wheel`. You will find settings for each of the wheels. For each one, click on **_Wheel Class_** and search for the `BP__` file that corresponds to the correct wheel position. - ->>![wheel blueprint](../img/wheel_blueprint.png) - -If you have any additional meshes for your vehicle (doors, lights, etc.) that are separate from the base mesh: - ->1. Drag them into the **_Mesh (VehicleMesh) (Inherited)_** hierarchy in the **_Components_** panel. -- Select the extra meshes in the hierarchy and search for `Collision` in the **_Details_** panel. -- Set **_Collision Presets_** to `NoCollision`. -- Select any light meshes in the hierarchy. Search for `Tag` in the **_Details_** panel and add the tag `emissive`. - -Click **_Save_** and **_Compile_**. - -__9. Add the vehicle to the Blueprint Library__. - ->1. In `Content/Carla/Blueprint/Vehicle`, open the `VehicleFactory` file. -- In the **_Generate Definitions_** tab, double click **_Vehicles_**.
-- In the **_Details_** panel, expand the **_Default Value_** section and add a new element to the vehicles array. -- Fill in the **_Make_** and **_Model_** of your vehicle. -- Fill in the **_Class_** value with your `BP_` file. -- Optionally, provide a set of recommended colors for the vehicle. -- Compile and save. - ->![vehicle factory](../img/vehicle_factory.png) - -__10. Test the vehicle__. - -Launch CARLA, open a terminal in `PythonAPI/examples` and run the following command: - -```sh -python3 manual_control.py --filter # The make or model defined in step 9 -``` - -!!! Note - Even if you used upper case characters in your make and model, they need to be converted to lower case when passed to the filter. - ---- - -## Add an N wheeled vehicle - -Adding an N wheeled vehicle follows the same import pipeline as that for 4 wheeled vehicles above, with a few differing steps. - -__5.__ __Configure the Animation Blueprint for an N wheeled vehicle__ - -Search for `BaseVehiclePawnNW` and press **_Select_**. - -![n_wheel_base](../img/base_nw.png) - -__6.__ __Prepare the vehicle and wheel blueprints__ - -Go to the folder of any native CARLA vehicle in Carla/Blueprints/Vehicles. From the Content Browser, copy the four wheel blueprints into the blueprint folder for your own vehicle. Rename the files to replace the old vehicle name with your own vehicle name. - -Copy the four wheel blueprints, and copy again for any additional wheels. In the case of a 6 wheeled vehicle, you will need 6 different wheels: FLW, FRW, MLW, MRW, RLW, RRW. - -![n_wheel_bps](../img/nwheels.png) - -__7.__ __Configure the wheel blueprints__ - -Follow section __7__ as above for the 4 wheeled vehicle. The key difference in the case of an N wheeled vehicle lies in the handbrake and steering parameters. In some vehicles (for example, a long wheelbase truck) the front two pairs of wheels will steer, and one set may steer more than others.
The rearmost pairs may be affected by handbrake; the specifics will depend upon the vehicle you are modelling. - -__8.__ __Configure vehicle blueprint__ - -In the Details panel, search for `wheel`. You will find settings for each of the wheels. For each one, click on Wheel Class and search for the BP__ file that corresponds to the correct wheel position. - -Note that, in the case of N wheeled vehicles, you need to set ALL the wheels. This is an example with a 6 wheeled vehicle: - -![n_wheel_config](../img/nwheel_config.png) - - -Finally, an additional consideration is setting the differential. In the case of a 4 wheeled vehicle, there are different differential presets (Limited Slip, Open 4W, etc.), but with N wheeled vehicles you need to choose the wheels to which you want to apply torque. In this example, only the middle and rear wheels have torque while the front wheels don't, but you can specify other configurations. The wheel numbers correspond to the image above (e.g. 0 will be the Front Left Wheel, as specified above). - -![n_wheel_mech](../img/nwheel_mech_setup.png) - -All other parameters, such as engine, transmission, and steering curve, are the same as for 4 wheeled vehicles. - ---- -## Add a 2 wheeled vehicle - -Adding 2 wheeled vehicles is similar to adding a 4 wheeled one, but due to the complexity of the animation you'll need to set up additional bones to guide the driver's animation. [Here](https://carla-assets.s3.eu-west-3.amazonaws.com/fbx/BikeSkeleton.rar) is the link to the reference skeleton for 2 wheeled vehicles. - -As with the 4 wheeled vehicles, orient the model towards positive "x" and every bone axis towards -positive x and with the z axis facing upwards. - -```yaml -Bone Setup: - - Bike_Rig: # The origin point of the mesh. Place it at point 0 of the scene. - - BikeBody: # The model's body centre.
- - Pedals: # If the vehicle is a bike, bind the pedalier to this bone; it will rotate with the bike's acceleration. - - RightPedal: # Sets the driver's foot position and rotates with the pedalier if the vehicle is a bike. - - LeftPedal: # ^ - - RearWheel: # Rear wheel of the vehicle. - - Handler: # Rotates with the front wheel of the vehicle; bind the vehicle handler to it. - - HandlerMidBone: # Positioned over the front wheel bone to orient the handler with the wheel. - - HandlerRight: # Sets the position of the driver's hand; no need to bind it to anything. - - HandlerLeft: # ^ - - Frontwheel: # Front wheel of the vehicle. - - RightHelperRotator: # These four additional bones belong to an obsolete system that stabilized the bike using additional invisible wheels. - - RightHelprWheel: # ^ - - LeftHelperRotator: # ^ - - LeftHelperWheel: # ^ - - Seat: # Sets the position of the driver's hip bone. No need to bind it to anything, but place it carefully. -``` - -__1.__ Import the fbx as a Skeletal Mesh into its own folder inside `Content/Carla/Static/Vehicles/2Wheeled`. When importing, select "General2WheeledVehicleSkeleton" as the skeleton. A Physics Asset should be automatically created and linked. - -__2.__ Tune the Physics Asset. Delete the automatically created shapes and add boxes to the `BikeBody` bone, matching the shape as closely as possible. Make sure Generate Hit Events is enabled. - Add a sphere for each wheel and set their "Physics Type" to "Kinematic". - -__3.__ Create the folder `Content/Blueprints/Vehicles/` - -__4.__ Inside that folder, create two blueprint classes derived from the "VehicleWheel" class. Call them `_FrontWheel` and `_RearWheel`. Set their "Shape Radius" to exactly match the mesh wheel radius (careful: radius, not diameter). Set their "Tire Config" to "CommonTireConfig". On the front wheel uncheck "Affected by Handbrake" and on the rear wheel set "Steer Angle" to zero.
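The "radius, not diameter" warning in step 4 is a frequent source of error: Unreal Engine expects the wheel radius in centimeters, so the value is half of the wheel's measured diameter. A tiny plain-Python sanity check (illustrative only, not part of CARLA or Unreal Engine):

```py
def shape_radius_cm(wheel_diameter_cm):
    """Unreal's "Shape Radius" is the wheel RADIUS in centimeters, not the diameter."""
    if wheel_diameter_cm <= 0:
        raise ValueError("wheel diameter must be positive")
    return wheel_diameter_cm / 2.0

# A bike wheel measuring 66 cm across needs a Shape Radius of 33.
print(shape_radius_cm(66.0))
```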
- -__5.__ Inside the same folder, create a blueprint class derived from `Base2WheeledVehicle` and call it ``. Open it for editing, select the "Mesh" component, and set the "Skeletal Mesh" - and the "Anim Class" to the corresponding ones. Then select the VehicleBounds component and set the size to cover the vehicle's area as seen from above. - -__6.__ Select the "VehicleMovement" component, and under "Vehicle Setup" expand "Wheel Setups" and set up each wheel. - -* __0:__ Wheel Class=`_FrontWheel`, Bone Name=`FrontWheel` -* __1:__ Wheel Class=`_FrontWheel`, Bone Name=`FrontWheel` -* __2:__ Wheel Class=`_RearWheel`, Bone Name=`RearWheel` -* __3:__ Wheel Class=`_RearWheel`, Bone Name=`RearWheel` -(You'll notice that we are basically placing two wheels in each bone. The vehicle class Unreal Engine provides does not support vehicles with a number of wheels other than 4, so we make it believe the vehicle has 4 wheels.) - -__7.__ Select the variable "is bike" and tick it if your model is a bike. This will activate the - pedalier rotation. Leave it unmarked if you are setting up a motorbike. - -__8.__ Find the variable "Back Rotation" and set it as fits best. Then select the SkeletalMesh component - (the driver) and move it along the x axis until it is in the seat position. - -__9.__ Test it: go to the CarlaGameMode blueprint and change "Default Pawn Class" to the newly - created bike blueprint. diff --git a/Docs/tuto_G_retrieve_data.md b/Docs/tuto_G_retrieve_data.md deleted file mode 100644 index bb38021a4..000000000 --- a/Docs/tuto_G_retrieve_data.md +++ /dev/null @@ -1,1343 +0,0 @@ -# Retrieve simulation data - -Learning an efficient way to retrieve simulation data is essential in CARLA. This holistic tutorial is advised for both newcomers and more experienced users. It starts from the very beginning, and gradually dives into the many options available in CARLA. - -First, the simulation is initialized with custom settings and traffic.
An ego vehicle is set to roam around the city, optionally with some basic sensors. The simulation is recorded, so that later it can be queried to find the highlights. After that, the original simulation is played back, and exploited to the limit. New sensors can be added to retrieve consistent data. The weather conditions can be changed. The recorder can even be used to test specific scenarios with different outputs. - -* [__Overview__](#overview) -* [__Set the simulation__](#set-the-simulation) - * [Map setting](#map-setting) - * [Weather setting](#weather-setting) -* [__Set traffic__](#set-traffic) - * [CARLA traffic and pedestrians](#carla-traffic-and-pedestrians) - * [SUMO co-simulation traffic](#sumo-co-simulation-traffic) -* [__Set the ego vehicle__](#set-the-ego-vehicle) - * [Spawn the ego vehicle](#spawn-the-ego-vehicle) - * [Place the spectator](#place-the-spectator) -* [__Set basic sensors__](#set-basic-sensors) - * [RGB camera](#rgb-camera) - * [Detectors](#detectors) - * [Other sensors](#other-sensors) -* [__Set advanced sensors__](#set-advanced-sensors) - * [Depth camera](#depth-camera) - * [Semantic segmentation camera](#semantic-segmentation-camera) - * [LIDAR raycast sensor](#lidar-raycast-sensor) - * [Radar sensor](#radar-sensor) -* [__No-rendering-mode__](#no-rendering-mode) - * [Simulate at a fast pace](#simulate-at-a-fast-pace) - * [Manual control without rendering](#manual-control-without-rendering) -* [__Record and retrieve data__](#record-and-retrieve-data) - * [Start recording](#start-recording) - * [Capture and record](#capture-and-record) - * [Stop recording](#stop-recording) -* [__Exploit the recording__](#exploit-the-recording) - * [Query the events](#query-the-events) - * [Choose a fragment](#choose-a-fragment) - * [Retrieve more data](#retrieve-more-data) - * [Change the weather](#change-the-weather) - * [Try new outcomes](#try-new-outcomes) -* [__Tutorial scripts__](#tutorial-scripts) - ---- -## Overview - -There are some common 
mistakes in the process of retrieving simulation data. Flooding the simulator with sensors, storing useless data, or struggling to find a specific event are some examples. However, some guidelines for this process can be provided. The goal is to ensure that data can be retrieved and replicated, and that the simulation can be examined and altered at will. - -!!! Note - This tutorial uses the [__CARLA 0.9.8 deb package__](start_quickstart.md). There may be minor changes depending on your CARLA version and installation, especially regarding paths. - -The tutorial presents a wide set of options for the different steps. All along, different scripts will be mentioned. Not all of them will be used; it depends on the specific use case. Most of them are already provided in CARLA for generic purposes. - -* __config.py__ changes the simulation settings. Map, rendering options, fixed time-step... - * `carla/PythonAPI/util/config.py` -* __dynamic_weather.py__ creates interesting weather conditions. - * `carla/PythonAPI/examples/dynamic_weather.py` -* __spawn_npc.py__ spawns some AI controlled vehicles and walkers. - * `carla/PythonAPI/examples/spawn_npc.py` -* __manual_control.py__ spawns an ego vehicle, and provides control over it. - * `carla/PythonAPI/examples/manual_control.py` - -However, there are two scripts mentioned throughout the tutorial that cannot be found in CARLA. They contain the fragments of code cited. This serves a twofold purpose. First of all, to encourage users to build their own scripts; it is important to have a full understanding of what the code is doing. In addition, the tutorial is only an outline that may, and should, vary a lot depending on user preferences. These two scripts are just an example. - -* __tutorial_ego.py__ spawns an ego vehicle with some basic sensors, and enables autopilot. The spectator is placed at the spawning position. The recorder starts at the very beginning, and stops when the script is finished.
-* __tutorial_replay.py__ reenacts the simulation that __tutorial_ego.py__ recorded. There are different fragments of code to query the recording, spawn some advanced sensors, change weather conditions, and reenact fragments of the recording. - -The full code can be found in the last section of the tutorial. Remember these scripts are not strict, but meant to be customized. Retrieving data in CARLA is as powerful as users want it to be. - -!!! Important - This tutorial requires some knowledge of Python. - ---- -## Set the simulation - -The first thing to do is get the simulation ready with the desired environment. - -Run CARLA. - -```sh -cd /opt/carla/bin -./CarlaUE4.sh -``` - -### Map setting - -Choose a map for the simulation to run. Take a look at the [map documentation](core_map.md#carla-maps) to learn more about their specific attributes. For the sake of this tutorial, __Town07__ is chosen. - -Open a new terminal. Change the map using the __config.py__ script. - -``` -cd /opt/carla/PythonAPI/util -python3 config.py --map Town07 -``` -This script can enable different settings. Some of them will be mentioned during the tutorial, others will not. Below is a brief summary. - -

- Optional arguments in config.py - -```sh - -h, --help show this help message and exit - --host H IP of the host CARLA Simulator (default: localhost) - -p P, --port P TCP port of CARLA Simulator (default: 2000) - -d, --default set default settings - -m MAP, --map MAP load a new map, use --list to see available maps - -r, --reload-map reload current map - --delta-seconds S set fixed delta seconds, zero for variable frame rate - --fps N set fixed FPS, zero for variable FPS (similar to - --delta-seconds) - --rendering enable rendering - --no-rendering disable rendering - --no-sync disable synchronous mode - --weather WEATHER set weather preset, use --list to see available - presets - -i, --inspect inspect simulation - -l, --list list available options - -b FILTER, --list-blueprints FILTER - list available blueprints matching FILTER (use '*' to - list them all) - -x XODR_FILE_PATH, --xodr-path XODR_FILE_PATH - load a new map with a minimum physical road - representation of the provided OpenDRIVE -``` -
-
- -![tuto_map](img/tuto_map.jpg) -
Aerial view of Town07
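The options above follow the standard `argparse` help format. As a minimal, self-contained sketch (not the real __config.py__ code), this is how a few of these flags are typically parsed:

```py
import argparse

def make_parser():
    # Sketch of a subset of the config.py options listed above.
    parser = argparse.ArgumentParser(description='CARLA configuration sketch')
    parser.add_argument('--host', metavar='H', default='localhost',
                        help='IP of the host CARLA Simulator (default: localhost)')
    parser.add_argument('-p', '--port', metavar='P', default=2000, type=int,
                        help='TCP port of CARLA Simulator (default: 2000)')
    parser.add_argument('-m', '--map', help='load a new map')
    parser.add_argument('--no-rendering', action='store_true',
                        help='disable rendering')
    return parser

args = make_parser().parse_args(['--map', 'Town07', '--no-rendering'])
print(args.host, args.port, args.map, args.no_rendering)
```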
- -### Weather setting - -Each town is loaded with a specific weather that fits it; however, the weather can be set at will. There are two scripts that offer different approaches to the matter. The first one sets a dynamic weather that changes conditions over time. The other sets custom weather conditions. It is also possible to code weather conditions. This will be covered later when [changing weather conditions](#change-the-weather). - -* __To set a dynamic weather__. Open a new terminal and run __dynamic_weather.py__. This script allows you to set the speed at which the weather changes, with `1.0` being the default setting. - -```sh -cd /opt/carla/PythonAPI/examples - -python3 dynamic_weather.py --speed 1.0 -``` - -* __To set custom conditions__. Use the script __environment.py__. There are quite a lot of possible settings. Take a look at the optional arguments, and the documentation for [carla.WeatherParameters](python_api.md#carla.WeatherParameters). - -```sh -cd /opt/carla/PythonAPI/util -python3 environment.py --clouds 100 --rain 80 --wetness 100 --puddles 60 --wind 80 --fog 50 - -```
- Optional arguments in environment.py - -```sh - -h, --help show this help message and exit - --host H IP of the host server (default: 127.0.0.1) - -p P, --port P TCP port to listen to (default: 2000) - --sun SUN Sun position presets [sunset | day | night] - --weather WEATHER Weather condition presets [clear | overcast | rain] - --altitude A, -alt A Sun altitude [-90.0, 90.0] - --azimuth A, -azm A Sun azimuth [0.0, 360.0] - --clouds C, -c C Clouds amount [0.0, 100.0] - --rain R, -r R Rain amount [0.0, 100.0] - --puddles Pd, -pd Pd Puddles amount [0.0, 100.0] - --wind W, -w W Wind intensity [0.0, 100.0] - --fog F, -f F Fog intensity [0.0, 100.0] - --fogdist Fd, -fd Fd Fog Distance [0.0, inf) - --wetness Wet, -wet Wet - Wetness intensity [0.0, 100.0] -``` -
-
- -![tuto_weather](img/tuto_weather.jpg) -
Weather changes applied
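The ranged arguments of __environment.py__ listed above expect values inside their documented bounds. A small plain-Python sketch (hypothetical helper names, not part of environment.py) that clamps a set of weather values and builds the corresponding command line:

```py
# Documented ranges for some environment.py weather arguments.
RANGES = {
    'clouds': (0.0, 100.0),
    'rain': (0.0, 100.0),
    'puddles': (0.0, 100.0),
    'wind': (0.0, 100.0),
    'fog': (0.0, 100.0),
    'wetness': (0.0, 100.0),
}

def clamp_weather(values):
    """Clamp each weather value to its documented [min, max] range."""
    clamped = {}
    for name, value in values.items():
        low, high = RANGES[name]
        clamped[name] = max(low, min(high, value))
    return clamped

def to_cli(values):
    """Build the environment.py argument list from a weather dict."""
    args = []
    for name, value in clamp_weather(values).items():
        args += ['--' + name, str(value)]
    return args

print(to_cli({'clouds': 120.0, 'rain': 80.0}))  # ['--clouds', '100.0', '--rain', '80.0']
```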
- ---- -## Set traffic - -Simulating traffic is one of the best ways to bring the map to life. It is also necessary for retrieving data in urban environments. There are different options to do so in CARLA. - -### CARLA traffic and pedestrians - -The CARLA traffic is managed by the [Traffic Manager](adv_traffic_manager.md) module. As for pedestrians, each of them has their own [carla.WalkerAIController](python_api.md#carla.WalkerAIController). - -Open a new terminal, and run __spawn_npc.py__ to spawn vehicles and walkers. Let's spawn 50 vehicles and the same number of walkers. - -```sh -cd /opt/carla/PythonAPI/examples -python3 spawn_npc.py -n 50 -w 50 --safe -```
- Optional arguments in spawn_npc.py - -```sh - -h, --help show this help message and exit - --host H IP of the host server (default: 127.0.0.1) - -p P, --port P TCP port to listen to (default: 2000) - -n N, --number-of-vehicles N - number of vehicles (default: 10) - -w W, --number-of-walkers W - number of walkers (default: 50) - --safe avoid spawning vehicles prone to accidents - --filterv PATTERN vehicles filter (default: "vehicle.*") - --filterw PATTERN pedestrians filter (default: "walker.pedestrian.*") - -tm_p P, --tm-port P port to communicate with TM (default: 8000) - --async Asynchronous mode execution -``` -
-
-![tuto_spawning](img/tuto_spawning.jpg) -
Vehicles spawned to simulate traffic.
- -### SUMO co-simulation traffic - -CARLA can run a co-simulation with SUMO. This allows for creating traffic in SUMO that will be propagated to CARLA. This co-simulation is bidirectional. Spawning vehicles in CARLA will do so in SUMO. Specific docs on this feature can be found [here](adv_sumo.md). - -This feature is available for CARLA 0.9.8 and later, in __Town01__, __Town04__, and __Town05__. The first one is the most stable. - -!!! Note - The co-simulation will enable synchronous mode in CARLA. Read the [documentation](adv_synchrony_timestep.md) to find out more about this. - -* First of all, install SUMO. -```sh -sudo add-apt-repository ppa:sumo/stable -sudo apt-get update -sudo apt-get install sumo sumo-tools sumo-doc -``` -* Set the environment variable SUMO_HOME. -```sh -echo "export SUMO_HOME=/usr/share/sumo" >> ~/.bashrc && source ~/.bashrc -``` -* With the CARLA server on, run the [SUMO-CARLA synchrony script](https://github.com/carla-simulator/carla/blob/master/Co-Simulation/Sumo/run_synchronization.py). -```sh -cd ~/carla/Co-Simulation/Sumo -python3 run_synchronization.py examples/Town01.sumocfg --sumo-gui -``` -* A SUMO window should have opened. __Press Play__ in order to start traffic in both simulations. -``` -> "Play" on SUMO window. -``` - -The traffic generated by this script is an example created by the CARLA team. By default it spawns the same vehicles following the same routes. These can be changed by the user in SUMO. - -![tuto_sumo](img/tuto_sumo.jpg) -
SUMO and CARLA co-simulating traffic.
- -!!! Warning - Right now, SUMO co-simulation is a beta feature. Vehicles do not have physics and do not take CARLA traffic lights into account. - ---- -## Set the ego vehicle - -From now until the recorder is stopped, there will be some fragments of code belonging to __tutorial_ego.py__. This script spawns the ego vehicle, optionally some sensors, and records the simulation until the user finishes the script. - -### Spawn the ego vehicle - -Vehicles controlled by the user are commonly differentiated in CARLA by setting the attribute `role_name` to `ego`. Other attributes can be set, some with recommended values. - -Below, a Tesla model is retrieved from the [blueprint library](bp_library.md), and spawned with a random recommended colour. One of the spawn points recommended by the map is chosen to place the ego vehicle. - -```py -# -------------- -# Spawn ego vehicle -# -------------- -ego_bp = world.get_blueprint_library().find('vehicle.tesla.model3') -ego_bp.set_attribute('role_name','ego') -print('\nEgo role_name is set') -ego_color = random.choice(ego_bp.get_attribute('color').recommended_values) -ego_bp.set_attribute('color',ego_color) -print('\nEgo color is set') - -spawn_points = world.get_map().get_spawn_points() -number_of_spawn_points = len(spawn_points) - -if 0 < number_of_spawn_points: - random.shuffle(spawn_points) - ego_transform = spawn_points[0] - ego_vehicle = world.spawn_actor(ego_bp,ego_transform) - print('\nEgo is spawned') -else: - logging.warning('Could not find any spawn points') -``` - -### Place the spectator - -The spectator actor controls the simulation view. Moving it via script is optional, but it may facilitate finding the ego vehicle. - -```py -# -------------- -# Spectator on ego position -# -------------- -spectator = world.get_spectator() -world_snapshot = world.wait_for_tick() -spectator.set_transform(ego_vehicle.get_transform()) -``` - ---- -## Set basic sensors - -The process to spawn any sensor is quite similar.
- -__1.__ Use the library to find sensor blueprints. -__2.__ Set specific attributes for the sensor. This is crucial. Attributes will shape the data retrieved. -__3.__ Attach the sensor to the ego vehicle. __The transform is relative to its parent__. The [carla.AttachmentType](python_api.md#carlaattachmenttype) will determine how the position of the sensor is updated. -__4.__ Add a `listen()` method. This is the key element. A [__lambda__](https://www.w3schools.com/python/python_lambda.asp) method that will be called each time the sensor listens for data. The argument is the sensor data retrieved. - -Having this basic guideline in mind, let's set some basic sensors for the ego vehicle. - -### RGB camera - -The [RGB camera](ref_sensors.md#rgb-camera) generates realistic pictures of the scene. It has the most settable attributes of all the sensors, but it is also a fundamental one. It should be understood as a real camera, with attributes such as `focal_distance`, `shutter_speed` or `gamma` to determine how it would work internally. There is also a specific set of attributes to define the lens distortion, and lots of advanced attributes. For example, the `lens_circle_multiplier` can be used to achieve an effect similar to a fisheye lens. Learn more about them in the [documentation](ref_sensors.md#rgb-camera). - -For the sake of simplicity, the script only sets the most commonly used attributes of this sensor. - -* __`image_size_x` and `image_size_y`__ will change the resolution of the output image. -* __`fov`__ is the horizontal field of view of the camera. - -After setting the attributes, it is time to spawn the sensor. The script places the camera on the hood of the car, pointing forward. It will capture the front view of the car. - -The data is retrieved as a [carla.Image](python_api.md#carla.Image) on every step. The listen method saves these to disk. The path can be altered at will.
The name of each image is coded to be based on the simulation frame where the shot was taken. - -```py -# -------------- -# Spawn attached RGB camera -# -------------- -cam_bp = None -cam_bp = world.get_blueprint_library().find('sensor.camera.rgb') -cam_bp.set_attribute("image_size_x",str(1920)) -cam_bp.set_attribute("image_size_y",str(1080)) -cam_bp.set_attribute("fov",str(105)) -cam_location = carla.Location(2,0,1) -cam_rotation = carla.Rotation(0,180,0) -cam_transform = carla.Transform(cam_location,cam_rotation) -ego_cam = world.spawn_actor(cam_bp,cam_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) -ego_cam.listen(lambda image: image.save_to_disk('tutorial/output/%.6d.jpg' % image.frame)) -``` -![tuto_rgb](img/tuto_rgb.jpg) -
RGB camera output
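Note that the `listen()` callback is invoked asynchronously from the sensor stream, so heavy work inside it (such as saving to disk) can slow down data retrieval. A common alternative is to push the data into a queue from the callback and consume it in the main loop. Below is a self-contained sketch of the pattern, with the sensor replaced by a plain thread so it runs without a simulator:

```py
import queue
import threading

image_queue = queue.Queue()

# In a real script this would be: ego_cam.listen(image_queue.put)
def fake_sensor(n_frames):
    """Stand-in for the sensor thread: delivers frame numbers instead of images."""
    for frame in range(n_frames):
        image_queue.put(frame)

sensor_thread = threading.Thread(target=fake_sensor, args=(5,))
sensor_thread.start()
sensor_thread.join()

# Main loop: consume the queued data at its own pace.
frames = []
while not image_queue.empty():
    frames.append(image_queue.get())
print(frames)  # [0, 1, 2, 3, 4]
```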
- -### Detectors - -These sensors retrieve data when the object they are attached to registers a specific event. There are three types of detector sensors, each one describing one type of event. - -* [__Collision detector.__](ref_sensors.md#collision-detector) Retrieves collisions between its parent and other actors. -* [__Lane invasion detector.__](ref_sensors.md#lane-invasion-detector) Registers when its parent crosses a lane marking. -* [__Obstacle detector.__](ref_sensors.md#obstacle-detector) Detects possible obstacles ahead of its parent. - -The data they retrieve will be helpful later when deciding which part of the simulation is going to be reenacted. In fact, the collisions can be explicitly queried using the recorder. The data is formatted to be printed. - -Only the obstacle detector blueprint has attributes to be set. Here are some important ones. - -* __`sensor_tick`__ sets the sensor to retrieve data only after `x` seconds pass. It is a common attribute for sensors that retrieve data on every step. -* __`distance` and `hit_radius`__ shape the debug line used to detect obstacles ahead. -* __`only_dynamics`__ determines whether static objects should be taken into account or not. By default, any object is considered. - -The script sets the obstacle detector to only consider dynamic objects. If the vehicle collides with any static object, it will be detected by the collision sensor. - -```py -# -------------- -# Add collision sensor to ego vehicle.
-# -------------- - -col_bp = world.get_blueprint_library().find('sensor.other.collision') -col_location = carla.Location(0,0,0) -col_rotation = carla.Rotation(0,0,0) -col_transform = carla.Transform(col_location,col_rotation) -ego_col = world.spawn_actor(col_bp,col_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) -def col_callback(colli): - print("Collision detected:\n"+str(colli)+'\n') -ego_col.listen(lambda colli: col_callback(colli)) - -# -------------- -# Add Lane invasion sensor to ego vehicle. -# -------------- - -lane_bp = world.get_blueprint_library().find('sensor.other.lane_invasion') -lane_location = carla.Location(0,0,0) -lane_rotation = carla.Rotation(0,0,0) -lane_transform = carla.Transform(lane_location,lane_rotation) -ego_lane = world.spawn_actor(lane_bp,lane_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) -def lane_callback(lane): - print("Lane invasion detected:\n"+str(lane)+'\n') -ego_lane.listen(lambda lane: lane_callback(lane)) - -# -------------- -# Add Obstacle sensor to ego vehicle. -# -------------- - -obs_bp = world.get_blueprint_library().find('sensor.other.obstacle') -obs_bp.set_attribute("only_dynamics",str(True)) -obs_location = carla.Location(0,0,0) -obs_rotation = carla.Rotation(0,0,0) -obs_transform = carla.Transform(obs_location,obs_rotation) -ego_obs = world.spawn_actor(obs_bp,obs_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) -def obs_callback(obs): - print("Obstacle detected:\n"+str(obs)+'\n') -ego_obs.listen(lambda obs: obs_callback(obs)) -``` -![tuto_detectors](img/tuto_detectors.jpg) -
Output for detector sensors
-
-### Other sensors
-
-Only two sensors of this category will be considered for the time being.
-
-* [__GNSS sensor.__](ref_sensors.md#gnss-sensor) Retrieves the geolocation of the sensor.
-* [__IMU sensor.__](ref_sensors.md#imu-sensor) Comprises an accelerometer, a gyroscope, and a compass.
-
-To get general measures for the vehicle, these two sensors are spawned at its center.
-
-The attributes available for these sensors mostly set the mean or standard deviation parameters in the noise model of the measure. This is useful to get more realistic measures. However, in __tutorial_ego.py__ only one attribute is set.
-
-* __`sensor_tick`__. As these measures are not supposed to vary significantly between steps, it is okay to retrieve the data only every so often. In this case, it is set to be printed every three seconds.
-
-```py
-# --------------
-# Add GNSS sensor to ego vehicle.
-# --------------
-
-gnss_bp = world.get_blueprint_library().find('sensor.other.gnss')
-gnss_location = carla.Location(0,0,0)
-gnss_rotation = carla.Rotation(0,0,0)
-gnss_transform = carla.Transform(gnss_location,gnss_rotation)
-gnss_bp.set_attribute("sensor_tick",str(3.0))
-ego_gnss = world.spawn_actor(gnss_bp,gnss_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid)
-def gnss_callback(gnss):
-    print("GNSS measure:\n"+str(gnss)+'\n')
-ego_gnss.listen(lambda gnss: gnss_callback(gnss))
-
-# --------------
-# Add IMU sensor to ego vehicle.
-# --------------
-
-imu_bp = world.get_blueprint_library().find('sensor.other.imu')
-imu_location = carla.Location(0,0,0)
-imu_rotation = carla.Rotation(0,0,0)
-imu_transform = carla.Transform(imu_location,imu_rotation)
-imu_bp.set_attribute("sensor_tick",str(3.0))
-ego_imu = world.spawn_actor(imu_bp,imu_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid)
-def imu_callback(imu):
-    print("IMU measure:\n"+str(imu)+'\n')
-ego_imu.listen(lambda imu: imu_callback(imu))
-```
-
-![tuto_other](img/tuto_other.jpg)
-
GNSS and IMU sensors output
-
----
-## Set advanced sensors
-
-The script __tutorial_replay.py__, among other things, contains definitions for more sensors. They work in the same way as the basic ones, but they may be a bit harder to understand.
-
-### Depth camera
-
-The [depth camera](ref_sensors.md#depth-camera) generates pictures of the scene that encode the depth of every pixel. However, the output is not straightforward. The depth buffer of the camera is encoded using an RGB color space, and it has to be translated to grayscale to be comprehensible.
-
-In order to do this, simply save the image as with the RGB camera, but apply a [carla.ColorConverter](python_api.md#carla.ColorConverter) to it. There are two conversions available for depth cameras.
-
-* __carla.ColorConverter.Depth__ translates the original depth with millimetric precision.
-* __carla.ColorConverter.LogarithmicDepth__ also has millimetric granularity, but provides better results at close distances and slightly worse ones for distant elements.
-
-The attributes for the depth camera only set elements previously stated for the RGB camera: `fov`, `image_size_x`, `image_size_y` and `sensor_tick`. The script sets this sensor to match the previous RGB camera used.
-
-```py
-# --------------
-# Add a Depth camera to ego vehicle.
-# --------------
-depth_cam = None
-depth_bp = world.get_blueprint_library().find('sensor.camera.depth')
-depth_location = carla.Location(2,0,1)
-depth_rotation = carla.Rotation(0,180,0)
-depth_transform = carla.Transform(depth_location,depth_rotation)
-depth_cam = world.spawn_actor(depth_bp,depth_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid)
-# This time, a color converter is applied to the image, to get a logarithmic depth view
-depth_cam.listen(lambda image: image.save_to_disk('tutorial/new_depth_output/%.6d.jpg' % image.frame,carla.ColorConverter.LogarithmicDepth))
-```
-
-![tuto_depths](img/tuto_depths.jpg)
-
Depth camera output. Simple conversion on the left, logarithmic on the right.
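The conversion can also be done by hand. As a sketch (the decoding formula follows the depth-camera reference; the helper name itself is hypothetical), each pixel packs a 24-bit depth value across the R, G and B channels:

```python
# Decode the RGB-encoded depth buffer of one pixel into meters.
# Formula from the depth-camera reference; the helper is illustrative.
def depth_to_meters(r, g, b):
    # The 24-bit depth value is packed as R + G*256 + B*256*256,
    # normalized to [0, 1] over the full 24-bit range.
    normalized = (r + g * 256 + b * 256 * 256) / (256 ** 3 - 1)
    return 1000.0 * normalized  # the camera far plane sits at 1000 m

print(depth_to_meters(0, 0, 0))        # 0.0: right at the camera plane
print(depth_to_meters(255, 255, 255))  # 1000.0: at the far plane
```

This is essentially the decoding that the __Depth__ converter applies before writing the grayscale image.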
-
-### Semantic segmentation camera
-
-The [semantic segmentation camera](ref_sensors.md#semantic-segmentation-camera) renders the elements in the scene with a different color depending on how they have been tagged. The tags are created by the simulator depending on the path of the asset used for spawning. For example, meshes tagged as `Pedestrians` are spawned with content stored in `Unreal/CarlaUnreal/Content/Static/Pedestrians`.
-
-The output is an image, as with any camera, but each pixel contains the tag encoded in the red channel. This original image must be converted using __ColorConverter.CityScapesPalette__. New tags can be created; read more in the [documentation](ref_sensors.md#semantic-segmentation-camera).
-
-The attributes available for this camera are exactly the same as for the depth camera. The script also sets this one to match the original RGB camera.
-
-```py
-# --------------
-# Add a new semantic segmentation camera to my ego
-# --------------
-sem_cam = None
-sem_bp = world.get_blueprint_library().find('sensor.camera.semantic_segmentation')
-sem_bp.set_attribute("image_size_x",str(1920))
-sem_bp.set_attribute("image_size_y",str(1080))
-sem_bp.set_attribute("fov",str(105))
-sem_location = carla.Location(2,0,1)
-sem_rotation = carla.Rotation(0,180,0)
-sem_transform = carla.Transform(sem_location,sem_rotation)
-sem_cam = world.spawn_actor(sem_bp,sem_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid)
-# This time, a color converter is applied to the image, to get the semantic segmentation view
-sem_cam.listen(lambda image: image.save_to_disk('tutorial/new_sem_output/%.6d.jpg' % image.frame,carla.ColorConverter.CityScapesPalette))
-```
-
-![tuto_sem](img/tuto_sem.jpg)
-
Semantic segmentation camera output
-
-### LIDAR raycast sensor
-
-The [LIDAR sensor](ref_sensors.md#lidar-raycast-sensor) simulates a rotating LIDAR. It creates a cloud of points that maps the scene in 3D. The LIDAR contains a set of lasers that rotate at a certain frequency. The lasers raycast the distance to impact, and store every shot as one single point.
-
-The way the array of lasers is arranged can be set using different sensor attributes.
-
-* __`upper_fov` and `lower_fov`__ set the angle of the highest and the lowest laser respectively.
-* __`channels`__ sets the number of lasers to be used. These are distributed along the desired _fov_.
-
-Other attributes set the way these points are calculated. They determine the amount of points that each laser calculates every step: `points_per_second / (FPS * channels)`.
-
-* __`range`__ is the maximum distance to capture.
-* __`points_per_second`__ is the amount of points that will be obtained every second. This quantity is divided between the amount of `channels`.
-* __`rotation_frequency`__ is the amount of times the LIDAR will rotate every second.
-
-The point cloud output is described as a [carla.LidarMeasurement](python_api.md#carla.LidarMeasurement). It can be iterated as a list of [carla.Location](python_api.md#carla.Location) or saved to the _.ply_ standard file format.
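The formula above can be sanity-checked with quick arithmetic. The sensor values here match the LIDAR blueprint configured in this tutorial; the 20 FPS simulation rate is an assumption:

```python
# Points each laser computes per step: points_per_second / (FPS * channels).
points_per_second = 90000   # value set on the blueprint in this tutorial
channels = 32               # value set on the blueprint in this tutorial
fps = 20                    # assumed simulation rate (1 / fixed time-step)

points_per_laser_per_step = points_per_second / (fps * channels)
points_per_step = points_per_second / fps

print(points_per_laser_per_step)  # 140.625 points per laser each step
print(points_per_step)            # 4500.0 points in total each step
```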
- -```py -# -------------- -# Add a new LIDAR sensor to my ego -# -------------- -lidar_cam = None -lidar_bp = world.get_blueprint_library().find('sensor.lidar.ray_cast') -lidar_bp.set_attribute('channels',str(32)) -lidar_bp.set_attribute('points_per_second',str(90000)) -lidar_bp.set_attribute('rotation_frequency',str(40)) -lidar_bp.set_attribute('range',str(20)) -lidar_location = carla.Location(0,0,2) -lidar_rotation = carla.Rotation(0,0,0) -lidar_transform = carla.Transform(lidar_location,lidar_rotation) -lidar_sen = world.spawn_actor(lidar_bp,lidar_transform,attach_to=ego_vehicle) -lidar_sen.listen(lambda point_cloud: point_cloud.save_to_disk('tutorial/new_lidar_output/%.6d.ply' % point_cloud.frame)) -``` - -The _.ply_ output can be visualized using __Meshlab__. - -__1.__ Install [Meshlab](http://www.meshlab.net/#download). -```sh -sudo apt-get update -y -sudo apt-get install -y meshlab -``` -__2.__ Open Meshlab. -```sh -meshlab -``` -__3.__ Open one of the _.ply_ files. `File > Import mesh...` - -![tuto_lidar](img/tuto_lidar.jpg) -
LIDAR output after being processed in Meshlab.
-
-### Radar sensor
-
-The [radar sensor](ref_sensors.md#radar-sensor) is similar to the LIDAR. It creates a conic view, and shoots lasers inside it to raycast their impacts. The output is a [carla.RadarMeasurement](python_api.md#carlaradarmeasurement). It contains a list of the [carla.RadarDetection](python_api.md#carlaradardetection) retrieved by the lasers. These are not points in space, but detections with data relative to the sensor: `azimuth`, `altitude`, `depth` and `velocity`.
-
-The attributes of this sensor mostly set the way the lasers are located.
-
-* __`horizontal_fov` and `vertical_fov`__ determine the amplitude of the conic view.
-* __`channels`__ sets the amount of lasers to be used. These are distributed along the desired `fov`.
-* __`range`__ is the maximum distance for the lasers to raycast.
-* __`points_per_second`__ sets the amount of points to be captured, which will be divided between the channels stated.
-
-The script places the sensor on the hood of the car, rotated slightly upwards. That way, the output will map the front view of the car. The `horizontal_fov` is incremented, and the `vertical_fov` diminished. The area of interest is especially the height at which vehicles and walkers usually move. The `range` is also changed from 100m to 20m, in order to retrieve data only right ahead of the vehicle.
-
-The callback is a bit more complex this time, showing more of its capabilities. It will draw the points captured by the radar on the fly. The points will be colored depending on their velocity relative to the ego vehicle.
-
-* __Blue__ for points approaching the vehicle.
-* __Red__ for points moving away from it.
-* __White__ for points static with regard to the ego vehicle.
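The color mapping in those bullets can be sketched in isolation. This is a hypothetical standalone version of the clamping logic used in the radar callback below, with its `velocity_range` of 7.5 m/s:

```python
def clamp(min_v, max_v, value):
    return max(min_v, min(value, max_v))

def velocity_to_rgb(velocity, velocity_range=7.5):
    """Map a detection's relative velocity (m/s) to an RGB debug color."""
    norm = velocity / velocity_range  # roughly in [-1, 1]
    r = int(clamp(0.0, 1.0, 1.0 - norm) * 255.0)
    g = int(clamp(0.0, 1.0, 1.0 - abs(norm)) * 255.0)
    b = int(abs(clamp(-1.0, 0.0, -1.0 - norm)) * 255.0)
    return (r, g, b)

print(velocity_to_rgb(0.0))   # (255, 255, 255): static points stay white
print(velocity_to_rgb(7.5))   # (0, 0, 255): one extreme saturates blue
print(velocity_to_rgb(-7.5))  # (255, 0, 0): the other extreme saturates red
```

Velocities between the extremes fade smoothly between the saturated color and white.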
- -```py -# -------------- -# Add a new radar sensor to my ego -# -------------- -rad_cam = None -rad_bp = world.get_blueprint_library().find('sensor.other.radar') -rad_bp.set_attribute('horizontal_fov', str(35)) -rad_bp.set_attribute('vertical_fov', str(20)) -rad_bp.set_attribute('range', str(20)) -rad_location = carla.Location(x=2.0, z=1.0) -rad_rotation = carla.Rotation(pitch=5) -rad_transform = carla.Transform(rad_location,rad_rotation) -rad_ego = world.spawn_actor(rad_bp,rad_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) -def rad_callback(radar_data): - velocity_range = 7.5 # m/s - current_rot = radar_data.transform.rotation - for detect in radar_data: - azi = math.degrees(detect.azimuth) - alt = math.degrees(detect.altitude) - # The 0.25 adjusts a bit the distance so the dots can - # be properly seen - fw_vec = carla.Vector3D(x=detect.depth - 0.25) - carla.Transform( - carla.Location(), - carla.Rotation( - pitch=current_rot.pitch + alt, - yaw=current_rot.yaw + azi, - roll=current_rot.roll)).transform(fw_vec) - - def clamp(min_v, max_v, value): - return max(min_v, min(value, max_v)) - - norm_velocity = detect.velocity / velocity_range # range [-1, 1] - r = int(clamp(0.0, 1.0, 1.0 - norm_velocity) * 255.0) - g = int(clamp(0.0, 1.0, 1.0 - abs(norm_velocity)) * 255.0) - b = int(abs(clamp(- 1.0, 0.0, - 1.0 - norm_velocity)) * 255.0) - world.debug.draw_point( - radar_data.transform.location + fw_vec, - size=0.075, - life_time=0.06, - persistent_lines=False, - color=carla.Color(r, g, b)) -rad_ego.listen(lambda radar_data: rad_callback(radar_data)) -``` - -![tuto_radar](img/tuto_radar.jpg) -
Radar output. The vehicle is stopped at a traffic light, so the static elements in front of it appear in white.
-
----
-## No-rendering mode
-
-The [no-rendering mode](adv_rendering_options.md) can be useful to run an initial simulation that will later be played back to retrieve data. This is especially useful if the simulation has extreme conditions, such as dense traffic.
-
-### Simulate at a fast pace
-
-Disabling the rendering spares the simulation a lot of work. As the GPU is not used, the server can work at full speed. This can be useful to simulate complex conditions at a fast pace. The best way to do so is to set a fixed time-step. When running an asynchronous server with a fixed time-step and no rendering, the only limitation for the simulation is the inner logic of the server.
-
-The same `config.py` used to [set the map](#map-setting) can disable rendering, and set a fixed time-step.
-
-```
-cd /opt/carla/PythonAPI/utils
-python3 config.py --no-rendering --delta-seconds 0.05 # Never greater than 0.1s
-```
-
-!!! Warning
-    Read the [documentation](adv_synchrony_timestep.md) before messing around with synchrony and time-step.
-
-### Manual control without rendering
-
-The script `PythonAPI/examples/no_rendering_mode.py` provides an overview of the simulation. It creates a minimalistic aerial view with Pygame that will follow the ego vehicle. This can be used along with __manual_control.py__ to generate a route at barely any cost, record it, and then play it back and exploit it to gather data.
-
-```
-cd /opt/carla/PythonAPI/examples
-python3 manual_control.py
-```
-
-```
-cd /opt/carla/PythonAPI/examples
-python3 no_rendering_mode.py --no-rendering
-```
-
-
- Optional arguments in no_rendering_mode.py - -```sh - -h, --help show this help message and exit - -v, --verbose print debug information - --host H IP of the host server (default: 127.0.0.1) - -p P, --port P TCP port to listen to (default: 2000) - --res WIDTHxHEIGHT window resolution (default: 1280x720) - --filter PATTERN actor filter (default: "vehicle.*") - --map TOWN start a new episode at the given TOWN - --no-rendering switch off server rendering - --show-triggers show trigger boxes of traffic signs - --show-connections show waypoint connections - --show-spawn-points show recommended spawn points -``` -
-
- -![tuto_no_rendering](img/tuto_no_rendering.jpg) -
no_rendering_mode.py working in Town07
-
-!!! Note
-    In this mode, GPU-based sensors will retrieve empty data. Cameras are useless, but other sensors, such as detectors, will work properly.
-
----
-## Record and retrieve data
-
-### Start recording
-
-The [__recorder__](adv_recorder.md) can be started at any time. The script does it at the very beginning, in order to capture everything, including the spawning of the first actors. If no path is specified, the log will be saved into `CarlaUnreal/Saved`.
-
-```py
-# --------------
-# Start recording
-# --------------
-client.start_recorder('~/tutorial/recorder/recording01.log')
-```
-
-### Capture and record
-
-There are many different ways to do this. Mostly, it comes down to either letting the vehicle roam around on autopilot or controlling it manually. The data for the sensors spawned will be retrieved on the fly. Check the output while recording, to verify that everything is set properly.
-
-* __Enable the autopilot.__ This will register the vehicle to the [Traffic Manager](adv_traffic_manager.md). It will roam around the city endlessly. The script does this, and creates a loop to prevent the script from finishing. The recording will go on until the user stops the script. Alternatively, a timer could be set to finish the script after a certain time.
-
-```py
-# --------------
-# Capture data
-# --------------
-ego_vehicle.set_autopilot(True)
-print('\nEgo autopilot enabled')
-
-while True:
-    world_snapshot = world.wait_for_tick()
-```
-
-* __Manual control.__ Run the script `PythonAPI/examples/manual_control.py` in one client, and the recorder in another one. Drive the ego vehicle around to create the desired route, and stop the recorder when finished. The __tutorial_ego.py__ script can be used to manage the recorder, but make sure to comment out other fragments of code.
-
-```
-cd /opt/carla/PythonAPI/examples
-python3 manual_control.py
-```
-
-!!! Note
-    To avoid rendering and save computational cost, enable [__no rendering mode__](adv_rendering_options.md#no-rendering-mode). The script `/PythonAPI/examples/no_rendering_mode.py` does this while creating a simple aerial view.
-
-### Stop recording
-
-The stop call is even simpler than the start call. When the recorder is done, the recording will be saved in the path stated previously.
-
-```py
-# --------------
-# Stop recording
-# --------------
-client.stop_recorder()
-```
-
----
-## Exploit the recording
-
-So far, a simulation has been recorded. Now it is time to examine the recording, find the most remarkable moments, and work with them. These steps are gathered in the script __tutorial_replay.py__. The outline is structured in different commented segments of code.
-
-It is time to run a new simulation.
-
-```sh
-./CarlaUnreal.sh
-```
-To reenact the simulation, [choose a fragment](#choose-a-fragment) and run the script containing the code for the playback.
-
-```sh
-python3 tutorial_replay.py
-```
-
-### Query the events
-
-The different queries are detailed in the [__recorder documentation__](adv_recorder.md). In summary, they retrieve data for specific events or frames. Use the queries to study the recording. Find the spotlight moments, and trace what can be of interest.
-
-```py
-# --------------
-# Query the recording
-# --------------
-# Show only the most important events in the recording.
-print(client.show_recorder_file_info("~/tutorial/recorder/recording01.log",False))
-# Show actors not moving 1 meter in 10 seconds.
-print(client.show_recorder_actors_blocked("~/tutorial/recorder/recording01.log",10,1))
-# Filter collisions between vehicles 'v' and any other type of actor 'a'.
-print(client.show_recorder_collisions("~/tutorial/recorder/recording01.log",'v','a'))
-```
-
-!!! Note
-    The recorder does not need to be on in order to do the queries.
-
-![tuto_query_frames](img/tuto_query_frames.jpg)
-
Query showing important events. This is the frame where the ego vehicle was spawned.
- -![tuto_query_blocked](img/tuto_query_blocked.jpg) -
Query showing actors blocked. In this simulation, the ego vehicle remained blocked for 100 seconds.
- -![tuto_query_collisions](img/tuto_query_collisions.jpg) -
Query showing a collision between the ego vehicle and an object of type "other".
-
-!!! Note
-    Getting detailed file info for every frame can be overwhelming. Use it after other queries to know where to look.
-
-### Choose a fragment
-
-After the queries, it may be a good idea to play back some moments of the simulation before messing around. It is very simple to do so, and it can be really helpful to get to know the simulation better. It is the best way to save time later.
-
-The method allows choosing the beginning and ending points of the playback, and an actor to follow.
-
-```py
-# --------------
-# Reenact a fragment of the recording
-# --------------
-client.replay_file("~/tutorial/recorder/recording01.log",45,10,0)
-```
-
-Here is a list of possible things to do now.
-
-* __Use the information from the queries.__ Find out the moment and the actors involved in an event, and play that again. Start the playback a few seconds before the event.
-* __Follow different actors.__ Different perspectives will show new events that are not included in the queries.
-* __Roam around with a free spectator view.__ Set the `actor_id` to `0`, and get a general view of the simulation. Be wherever and whenever wanted, thanks to the recording.
-
-!!! Note
-    When the playback stops, the simulation doesn't. Walkers will stand still, and vehicles will continue roaming around. This may happen either when the log ends, or when the playback gets to the ending point stated.
-
-### Retrieve more data
-
-The recorder will recreate in this simulation the exact same conditions as the original. That ensures consistent data within different playbacks.
-
-Gather a list of the important moments, actors and events. Add sensors wherever needed and play the simulation back. The process is exactly the same as before. The script __tutorial_replay.py__ provides different examples that have been thoroughly explained in the [__Set advanced sensors__](#set-advanced-sensors) section. Others have been explained in the section [__Set basic sensors__](#set-basic-sensors).
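When choosing fragments to play back repeatedly, it helps to reason about the time window that `replay_file` selects. Here is a small hypothetical helper, not part of the tutorial scripts; it mirrors the convention described in the recorder documentation, where a negative start time counts from the end of the recording and a duration of `0` plays until the end.

```python
def resolve_replay_window(recording_length, start, duration):
    """Return the (begin, end) seconds of the fragment that would be replayed."""
    # A negative start is an offset from the end of the recording.
    begin = recording_length + start if start < 0 else start
    # A duration of 0 means "play until the end of the log".
    end = recording_length if duration == 0 else min(begin + duration, recording_length)
    return begin, end

# replay_file("recording01.log", 45, 10, 0) on a 60-second log:
print(resolve_replay_window(60.0, 45, 10))   # (45, 55)
# A negative start counts from the end of the recording:
print(resolve_replay_window(60.0, -15, 0))   # (45.0, 60.0)
```

With a 60-second log, `(45, 10)` replays seconds 45 to 55, and `(-15, 0)` replays the final 15 seconds.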
-
-Add as many sensors as needed, wherever they are needed. Play the simulation back as many times as desired, and retrieve as much data as needed.
-
-### Change the weather
-
-The playback will recreate the original weather conditions. However, these can be altered at will. This may be interesting to compare how the weather affects sensors, while keeping the rest of the events the same.
-
-Get the current weather and modify it freely. Remember that [carla.WeatherParameters](python_api.md#carla.WeatherParameters) has some presets available. The script will change the environment to a foggy sunset.
-
-```py
-# --------------
-# Change weather for playback
-# --------------
-weather = world.get_weather()
-weather.sun_altitude_angle = -30
-weather.fog_density = 65
-weather.fog_distance = 10
-world.set_weather(weather)
-```
-
-### Try new outcomes
-
-The new simulation is not strictly linked to the recording. It can be modified at any time, and even when the playback stops, the simulation goes on.
-
-This can work in the user's favor. For instance, collisions can be forced or avoided by playing the simulation back a few seconds before an event, and spawning or destroying an actor. Ending the playback at a specific moment can also be useful, as vehicles may then take different paths.
-
-Change the conditions and mess with the simulation. There is nothing to lose, as the recorder guarantees that the initial simulation can always be reenacted. This is the key to exploiting the full potential of CARLA.
-
----
-## Tutorial scripts
-
-Below are the two scripts that gather the fragments of code in this tutorial. Most of the code is commented, as it is meant to be modified to fit specific purposes.
-
-
-tutorial_ego.py - -```py -import glob -import os -import sys -import time - -try: - sys.path.append(glob.glob('../carla/dist/carla-*%d.%d-%s.egg' % ( - sys.version_info.major, - sys.version_info.minor, - 'win-amd64' if os.name == 'nt' else 'linux-x86_64'))[0]) -except IndexError: - pass - -import carla - -import argparse -import logging -import random - - -def main(): - argparser = argparse.ArgumentParser( - description=__doc__) - argparser.add_argument( - '--host', - metavar='H', - default='127.0.0.1', - help='IP of the host server (default: 127.0.0.1)') - argparser.add_argument( - '-p', '--port', - metavar='P', - default=2000, - type=int, - help='TCP port to listen to (default: 2000)') - args = argparser.parse_args() - - logging.basicConfig(format='%(levelname)s: %(message)s', level=logging.INFO) - - client = carla.Client(args.host, args.port) - client.set_timeout(10.0) - - try: - - world = client.get_world() - ego_vehicle = None - ego_cam = None - ego_col = None - ego_lane = None - ego_obs = None - ego_gnss = None - ego_imu = None - - # -------------- - # Start recording - # -------------- - """ - client.start_recorder('~/tutorial/recorder/recording01.log') - """ - - # -------------- - # Spawn ego vehicle - # -------------- - """ - ego_bp = world.get_blueprint_library().find('vehicle.tesla.model3') - ego_bp.set_attribute('role_name','ego') - print('\nEgo role_name is set') - ego_color = random.choice(ego_bp.get_attribute('color').recommended_values) - ego_bp.set_attribute('color',ego_color) - print('\nEgo color is set') - - spawn_points = world.get_map().get_spawn_points() - number_of_spawn_points = len(spawn_points) - - if 0 < number_of_spawn_points: - random.shuffle(spawn_points) - ego_transform = spawn_points[0] - ego_vehicle = world.spawn_actor(ego_bp,ego_transform) - print('\nEgo is spawned') - else: - logging.warning('Could not found any spawn points') - """ - - # -------------- - # Add a RGB camera sensor to ego vehicle. 
- # -------------- - """ - cam_bp = None - cam_bp = world.get_blueprint_library().find('sensor.camera.rgb') - cam_bp.set_attribute("image_size_x",str(1920)) - cam_bp.set_attribute("image_size_y",str(1080)) - cam_bp.set_attribute("fov",str(105)) - cam_location = carla.Location(2,0,1) - cam_rotation = carla.Rotation(0,180,0) - cam_transform = carla.Transform(cam_location,cam_rotation) - ego_cam = world.spawn_actor(cam_bp,cam_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - ego_cam.listen(lambda image: image.save_to_disk('~/tutorial/output/%.6d.jpg' % image.frame)) - """ - - # -------------- - # Add collision sensor to ego vehicle. - # -------------- - """ - col_bp = world.get_blueprint_library().find('sensor.other.collision') - col_location = carla.Location(0,0,0) - col_rotation = carla.Rotation(0,0,0) - col_transform = carla.Transform(col_location,col_rotation) - ego_col = world.spawn_actor(col_bp,col_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - def col_callback(colli): - print("Collision detected:\n"+str(colli)+'\n') - ego_col.listen(lambda colli: col_callback(colli)) - """ - - # -------------- - # Add Lane invasion sensor to ego vehicle. - # -------------- - """ - lane_bp = world.get_blueprint_library().find('sensor.other.lane_invasion') - lane_location = carla.Location(0,0,0) - lane_rotation = carla.Rotation(0,0,0) - lane_transform = carla.Transform(lane_location,lane_rotation) - ego_lane = world.spawn_actor(lane_bp,lane_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - def lane_callback(lane): - print("Lane invasion detected:\n"+str(lane)+'\n') - ego_lane.listen(lambda lane: lane_callback(lane)) - """ - - # -------------- - # Add Obstacle sensor to ego vehicle. 
- # -------------- - """ - obs_bp = world.get_blueprint_library().find('sensor.other.obstacle') - obs_bp.set_attribute("only_dynamics",str(True)) - obs_location = carla.Location(0,0,0) - obs_rotation = carla.Rotation(0,0,0) - obs_transform = carla.Transform(obs_location,obs_rotation) - ego_obs = world.spawn_actor(obs_bp,obs_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - def obs_callback(obs): - print("Obstacle detected:\n"+str(obs)+'\n') - ego_obs.listen(lambda obs: obs_callback(obs)) - """ - - # -------------- - # Add GNSS sensor to ego vehicle. - # -------------- - """ - gnss_bp = world.get_blueprint_library().find('sensor.other.gnss') - gnss_location = carla.Location(0,0,0) - gnss_rotation = carla.Rotation(0,0,0) - gnss_transform = carla.Transform(gnss_location,gnss_rotation) - gnss_bp.set_attribute("sensor_tick",str(3.0)) - ego_gnss = world.spawn_actor(gnss_bp,gnss_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - def gnss_callback(gnss): - print("GNSS measure:\n"+str(gnss)+'\n') - ego_gnss.listen(lambda gnss: gnss_callback(gnss)) - """ - - # -------------- - # Add IMU sensor to ego vehicle. 
- # -------------- - """ - imu_bp = world.get_blueprint_library().find('sensor.other.imu') - imu_location = carla.Location(0,0,0) - imu_rotation = carla.Rotation(0,0,0) - imu_transform = carla.Transform(imu_location,imu_rotation) - imu_bp.set_attribute("sensor_tick",str(3.0)) - ego_imu = world.spawn_actor(imu_bp,imu_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - def imu_callback(imu): - print("IMU measure:\n"+str(imu)+'\n') - ego_imu.listen(lambda imu: imu_callback(imu)) - """ - - # -------------- - # Place spectator on ego spawning - # -------------- - """ - spectator = world.get_spectator() - world_snapshot = world.wait_for_tick() - spectator.set_transform(ego_vehicle.get_transform()) - """ - - # -------------- - # Enable autopilot for ego vehicle - # -------------- - """ - ego_vehicle.set_autopilot(True) - """ - - # -------------- - # Game loop. Prevents the script from finishing. - # -------------- - while True: - world_snapshot = world.wait_for_tick() - - finally: - # -------------- - # Stop recording and destroy actors - # -------------- - client.stop_recorder() - if ego_vehicle is not None: - if ego_cam is not None: - ego_cam.stop() - ego_cam.destroy() - if ego_col is not None: - ego_col.stop() - ego_col.destroy() - if ego_lane is not None: - ego_lane.stop() - ego_lane.destroy() - if ego_obs is not None: - ego_obs.stop() - ego_obs.destroy() - if ego_gnss is not None: - ego_gnss.stop() - ego_gnss.destroy() - if ego_imu is not None: - ego_imu.stop() - ego_imu.destroy() - ego_vehicle.destroy() - -if __name__ == '__main__': - - try: - main() - except KeyboardInterrupt: - pass - finally: - print('\nDone with tutorial_ego.') - -``` -
-
-
-tutorial_replay.py - -```py -import glob -import os -import sys -import time -import math -import weakref - -try: - sys.path.append(glob.glob('../carla/dist/carla-*%d.%d-%s.egg' % ( - sys.version_info.major, - sys.version_info.minor, - 'win-amd64' if os.name == 'nt' else 'linux-x86_64'))[0]) -except IndexError: - pass - -import carla - -import argparse -import logging -import random - -def main(): - client = carla.Client('127.0.0.1', 2000) - client.set_timeout(10.0) - - try: - - world = client.get_world() - ego_vehicle = None - ego_cam = None - depth_cam = None - depth_cam02 = None - sem_cam = None - rad_ego = None - lidar_sen = None - - # -------------- - # Query the recording - # -------------- - """ - # Show the most important events in the recording. - print(client.show_recorder_file_info("~/tutorial/recorder/recording05.log",False)) - # Show actors not moving 1 meter in 10 seconds. - #print(client.show_recorder_actors_blocked("~/tutorial/recorder/recording04.log",10,1)) - # Show collisions between any type of actor. - #print(client.show_recorder_collisions("~/tutorial/recorder/recording04.log",'v','a')) - """ - - # -------------- - # Reenact a fragment of the recording - # -------------- - """ - client.replay_file("~/tutorial/recorder/recording03.log",0,30,0) - """ - - # -------------- - # Set playback simulation conditions - # -------------- - """ - ego_vehicle = world.get_actor(322) #Store the ID from the simulation or query the recording to find out - """ - - # -------------- - # Place spectator on ego spawning - # -------------- - """ - spectator = world.get_spectator() - world_snapshot = world.wait_for_tick() - spectator.set_transform(ego_vehicle.get_transform()) - """ - - # -------------- - # Change weather conditions - # -------------- - """ - weather = world.get_weather() - weather.sun_altitude_angle = -30 - weather.fog_density = 65 - weather.fog_distance = 10 - world.set_weather(weather) - """ - - # -------------- - # Add a RGB camera to ego vehicle. 
- # -------------- - """ - cam_bp = None - cam_bp = world.get_blueprint_library().find('sensor.camera.rgb') - cam_location = carla.Location(2,0,1) - cam_rotation = carla.Rotation(0,180,0) - cam_transform = carla.Transform(cam_location,cam_rotation) - cam_bp.set_attribute("image_size_x",str(1920)) - cam_bp.set_attribute("image_size_y",str(1080)) - cam_bp.set_attribute("fov",str(105)) - ego_cam = world.spawn_actor(cam_bp,cam_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - ego_cam.listen(lambda image: image.save_to_disk('~/tutorial/new_rgb_output/%.6d.jpg' % image.frame)) - """ - - # -------------- - # Add a Logarithmic Depth camera to ego vehicle. - # -------------- - """ - depth_cam = None - depth_bp = world.get_blueprint_library().find('sensor.camera.depth') - depth_bp.set_attribute("image_size_x",str(1920)) - depth_bp.set_attribute("image_size_y",str(1080)) - depth_bp.set_attribute("fov",str(105)) - depth_location = carla.Location(2,0,1) - depth_rotation = carla.Rotation(0,180,0) - depth_transform = carla.Transform(depth_location,depth_rotation) - depth_cam = world.spawn_actor(depth_bp,depth_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - # This time, a color converter is applied to the image, to get the semantic segmentation view - depth_cam.listen(lambda image: image.save_to_disk('~/tutorial/de_log/%.6d.jpg' % image.frame,carla.ColorConverter.LogarithmicDepth)) - """ - # -------------- - # Add a Depth camera to ego vehicle. 
- # --------------
- """
- depth_cam02 = None
- depth_bp02 = world.get_blueprint_library().find('sensor.camera.depth')
- depth_bp02.set_attribute("image_size_x",str(1920))
- depth_bp02.set_attribute("image_size_y",str(1080))
- depth_bp02.set_attribute("fov",str(105))
- depth_location02 = carla.Location(2,0,1)
- depth_rotation02 = carla.Rotation(0,180,0)
- depth_transform02 = carla.Transform(depth_location02,depth_rotation02)
- depth_cam02 = world.spawn_actor(depth_bp02,depth_transform02,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid)
- # This time, a color converter is applied to the image to get a raw depth view
- depth_cam02.listen(lambda image: image.save_to_disk('~/tutorial/de/%.6d.jpg' % image.frame,carla.ColorConverter.Depth))
- """
-
- # --------------
- # Add a new semantic segmentation camera to ego vehicle
- # --------------
- """
- sem_cam = None
- sem_bp = world.get_blueprint_library().find('sensor.camera.semantic_segmentation')
- sem_bp.set_attribute("image_size_x",str(1920))
- sem_bp.set_attribute("image_size_y",str(1080))
- sem_bp.set_attribute("fov",str(105))
- sem_location = carla.Location(2,0,1)
- sem_rotation = carla.Rotation(0,180,0)
- sem_transform = carla.Transform(sem_location,sem_rotation)
- sem_cam = world.spawn_actor(sem_bp,sem_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid)
- # This time, a color converter is applied to the image, to get the semantic segmentation view
- sem_cam.listen(lambda image: image.save_to_disk('~/tutorial/new_sem_output/%.6d.jpg' % image.frame,carla.ColorConverter.CityScapesPalette))
- """
-
- # --------------
- # Add a new radar sensor to ego vehicle
- # --------------
- """
- rad_cam = None
- rad_bp = world.get_blueprint_library().find('sensor.other.radar')
- rad_bp.set_attribute('horizontal_fov', str(35))
- rad_bp.set_attribute('vertical_fov', str(20))
- rad_bp.set_attribute('range', str(20))
- rad_location = carla.Location(x=2.8, z=1.0)
-
rad_rotation = carla.Rotation(pitch=5) - rad_transform = carla.Transform(rad_location,rad_rotation) - rad_ego = world.spawn_actor(rad_bp,rad_transform,attach_to=ego_vehicle, attachment_type=carla.AttachmentType.Rigid) - def rad_callback(radar_data): - velocity_range = 7.5 # m/s - current_rot = radar_data.transform.rotation - for detect in radar_data: - azi = math.degrees(detect.azimuth) - alt = math.degrees(detect.altitude) - # The 0.25 adjusts a bit the distance so the dots can - # be properly seen - fw_vec = carla.Vector3D(x=detect.depth - 0.25) - carla.Transform( - carla.Location(), - carla.Rotation( - pitch=current_rot.pitch + alt, - yaw=current_rot.yaw + azi, - roll=current_rot.roll)).transform(fw_vec) - - def clamp(min_v, max_v, value): - return max(min_v, min(value, max_v)) - - norm_velocity = detect.velocity / velocity_range # range [-1, 1] - r = int(clamp(0.0, 1.0, 1.0 - norm_velocity) * 255.0) - g = int(clamp(0.0, 1.0, 1.0 - abs(norm_velocity)) * 255.0) - b = int(abs(clamp(- 1.0, 0.0, - 1.0 - norm_velocity)) * 255.0) - world.debug.draw_point( - radar_data.transform.location + fw_vec, - size=0.075, - life_time=0.06, - persistent_lines=False, - color=carla.Color(r, g, b)) - rad_ego.listen(lambda radar_data: rad_callback(radar_data)) - """ - - # -------------- - # Add a new LIDAR sensor to ego vehicle - # -------------- - """ - lidar_cam = None - lidar_bp = world.get_blueprint_library().find('sensor.lidar.ray_cast') - lidar_bp.set_attribute('channels',str(32)) - lidar_bp.set_attribute('points_per_second',str(90000)) - lidar_bp.set_attribute('rotation_frequency',str(40)) - lidar_bp.set_attribute('range',str(20)) - lidar_location = carla.Location(0,0,2) - lidar_rotation = carla.Rotation(0,0,0) - lidar_transform = carla.Transform(lidar_location,lidar_rotation) - lidar_sen = world.spawn_actor(lidar_bp,lidar_transform,attach_to=ego_vehicle,attachment_type=carla.AttachmentType.Rigid) - lidar_sen.listen(lambda point_cloud: 
point_cloud.save_to_disk('~/tutorial/new_lidar_output/%.6d.ply' % point_cloud.frame))
- """
-
- # --------------
- # Game loop. Prevents the script from finishing.
- # --------------
- while True:
- world_snapshot = world.wait_for_tick()
-
- finally:
- # --------------
- # Destroy actors
- # --------------
- if ego_vehicle is not None:
- if ego_cam is not None:
- ego_cam.stop()
- ego_cam.destroy()
- if depth_cam is not None:
- depth_cam.stop()
- depth_cam.destroy()
- if depth_cam02 is not None:
- depth_cam02.stop()
- depth_cam02.destroy()
- if sem_cam is not None:
- sem_cam.stop()
- sem_cam.destroy()
- if rad_ego is not None:
- rad_ego.stop()
- rad_ego.destroy()
- if lidar_sen is not None:
- lidar_sen.stop()
- lidar_sen.destroy()
- ego_vehicle.destroy()
- print('\nNothing to be done.')
-
-
-if __name__ == '__main__':
-
- try:
- main()
- except KeyboardInterrupt:
- pass
- finally:
- print('\nDone with tutorial_replay.')
-```
-
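The color logic inside `rad_callback` above is plain arithmetic, so it can be checked in isolation without a running simulator. Below is a standalone sketch of that velocity-to-color mapping, using the same `velocity_range` of 7.5 m/s; `velocity_to_color` is a helper name introduced here, not part of the script:

```python
def clamp(min_v, max_v, value):
    # Constrain value to the closed interval [min_v, max_v].
    return max(min_v, min(value, max_v))

def velocity_to_color(velocity, velocity_range=7.5):
    # Same arithmetic as rad_callback: normalize the radial velocity and
    # turn it into an RGB debug color. Negative normalized velocity trends
    # towards red, positive towards blue, and zero maps to white.
    norm_velocity = velocity / velocity_range  # roughly in [-1, 1]
    r = int(clamp(0.0, 1.0, 1.0 - norm_velocity) * 255.0)
    g = int(clamp(0.0, 1.0, 1.0 - abs(norm_velocity)) * 255.0)
    b = int(abs(clamp(-1.0, 0.0, -1.0 - norm_velocity)) * 255.0)
    return (r, g, b)
```

Feeding the two extremes of the range produces pure red and pure blue points, which is why fast-moving detections stand out in the debug overlay.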
-
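The depth cameras above use `carla.ColorConverter` to turn their raw output into a viewable image. The raw frame packs depth across the R, G and B channels, and can be decoded to meters by hand; this sketch follows the encoding described in CARLA's camera documentation, and `decode_depth` is a name introduced here:

```python
def decode_depth(r, g, b):
    # CARLA packs depth into the R, G and B channels of each pixel
    # (R is the least significant byte). Normalize the 24-bit value
    # and scale by the 1000 m far plane to get meters.
    normalized = (r + g * 256 + b * 256 * 256) / (256 ** 3 - 1)
    return 1000.0 * normalized
```

This is what `ColorConverter.Depth` and `ColorConverter.LogarithmicDepth` compute internally before rendering a grayscale image; the logarithmic variant additionally rescales the result to give more contrast to nearby objects.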
-
---
That is a wrap on how to retrieve data from the simulation. Make sure to play around: change the conditions of the simulator and experiment with the sensor settings. The possibilities are endless.


Visit the forum to post any questions or suggestions that come up as you read.

- -
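One small detail worth keeping when reusing these scripts: every listener names its output file with `'%.6d' % frame`, which zero-pads the frame number to six digits so files sort in frame order. A minimal sketch of that pattern (`output_path` is a hypothetical helper, not part of the script):

```python
def output_path(directory, frame, ext):
    # '%.6d' zero-pads the frame number to six digits, so files such as
    # 000042.jpg list in frame order in the output directory.
    return '%s/%.6d.%s' % (directory, frame, ext)
```

Without the padding, lexicographic listings would interleave frames (10 before 2), which makes assembling the saved images into a sequence harder.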
diff --git a/Docs/tutorials.md b/Docs/tutorials.md index c09204d82..f0a556428 100644 --- a/Docs/tutorials.md +++ b/Docs/tutorials.md @@ -6,26 +6,18 @@ Here you will find the multitude of tutorials available to help you understand h ### CARLA features -[__Retrieve simulation data__](tuto_G_retrieve_data.md) — A step by step guide to properly gather data using the recorder. [__Traffic manager__](tuto_G_traffic_manager.md) — How to use traffic manager to guide traffic around your town. [__Texture streaming__](tuto_G_texture_streaming.md) — Modify textures of map objects in real time to add variation. [__Instance segmentation camera__](tuto_G_instance_segmentation_sensor.md) — Use an instance segmentation camera to distinguish objects of the same class. [__Bounding boxes__](tuto_G_bounding_boxes.md) — Project bounding boxes from CARLA objects into the camera. [__Pedestrian bones__](tuto_G_pedestrian_bones.md) — Project pedestrian skeleton into camera plane. [__Control walker skeletons__](tuto_G_control_walker_skeletons.md) — Animate walkers using skeletons. - -### Building and integration - -[__Build Unreal Engine and CARLA in Docker__](build_docker_unreal.md) — Build Unreal Engine and CARLA in Docker. -[__CarSim Integration__](tuto_G_carsim_integration.md) — Tutorial on how to run a simulation using the CarSim vehicle dynamics engine. -[__RLlib Integration__](tuto_G_rllib_integration.md) — Find out how to run your own experiment using the RLlib library. -[__Chrono Integration__](tuto_G_chrono.md) — Use the Chrono integration to simulation physics. [__PyGame control__](tuto_G_pygame.md) — Use PyGame to display the output of camera sensors. ## Assets and maps [__Generate maps with OpenStreetMap__](tuto_G_openstreetmap.md) — Use OpenStreetMap to generate maps for use in simulations. -[__Add a new vehicle__](tuto_A_add_vehicle.md) — Prepare a vehicle to be used in CARLA. +[__Add a new vehicle__](tuto_content_authoring_vehicles.md) — Prepare a vehicle to be used in CARLA. 
[__Add new props__](tuto_A_add_props.md) — Import additional props into CARLA. [__Create standalone packages__](tuto_A_create_standalone.md) — Generate and handle standalone packages for assets. [__Material customization__](tuto_A_material_customization.md) — Edit vehicle and building materials. diff --git a/mkdocs.yml b/mkdocs.yml index d1149c11f..b9ca3a7a3 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -52,11 +52,5 @@ markdown_extensions: theme: name: 'readthedocs' - logo: img/logos/carla_ue5_logo.png - palette: - # Dark mode - - scheme: slate - primary: blue - accent: blue - toggle: + logo: carla_ue5_logo.png