2022.03.08 10:29 HDGTurkey LibGDX Demo Game Application With ML Kit Hand Gesture Detection and Ashley System Library Part 1
Introduction

In this demo application, we will build a LibGDX demo game application with ML Kit hand gesture detection and the Ashley entity system library. If you don't know anything about LibGDX, you can read my first article about LibGDX at this link. First, I will explain the LibGDX Ashley entity system library and how to use and implement it in LibGDX. After that, I will explain ML Kit hand gesture detection and how to implement it. Finally, we will create a custom camera view so we can use hand gesture detection while playing the game.

Integrating Applications with HMS Core

To start developing an app with Huawei Mobile Services, you need to integrate your application with HMS Core. Check the link below to integrate your application, and don't forget to enable ML Kit in AppGallery Connect.

https://medium.com/huawei-developers/android-integrating-your-apps-with-huawei-hms-core-1f1e2a090e98

Ashley Entity System Library

Ashley is an entity system library that is managed under the LibGDX organization and is well suited for game development. It depends on LibGDX utility classes. Entity systems provide a different way to manage data and functionality for large sets of objects without having to make the object classes rich with inheritance. Ashley can be a helpful approach for those looking for an object modeling style like the one Unity provides, but with the scope of a framework instead of a game engine.

The Ashley library is formed by the combination of Entity, Component, System, Family, and Engine:

- Entity: Entities are the game objects that exist in our game world; each entity carries a list of components.
- Component: Components are game data; they are attached to entities and read by systems.
- System: Systems are game logic; they use a Family to select the specific entities they work on.
There are three base systems in the Ashley library: IntervalSystem, EntitySystem, and IteratingSystem.

- Family: Families are groups of components; a family defines which components an entity must have for a specific system to process it. Systems only work with those entities.
- Engine: The Engine class is the core of the Ashley library; we add systems and entities to the game world through the Engine class.

ML Kit Hand Gesture Detection

This service provides two capabilities: hand keypoint detection and hand gesture recognition. The hand keypoint detection capability can detect 21 hand keypoints (including fingertips, knuckles, and wrists) and return their positions. The hand gesture recognition capability can detect and return the rectangular areas of the hand in images and videos, along with the type and confidence of the gesture. It can recognize 14 gestures, including thumbs-up/down, the OK sign, fist, finger heart, and the number gestures from 1 to 9. Both capabilities support detection from static images and real-time camera streams. In this project, I use hand gesture detection and the number-one sign to move the player.

Assigning Permissions in the Manifest File

The ML Kit hand gesture service requires some permissions. We should declare them in the AndroidManifest.xml file as follows:
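The exact permission list from the original post is not shown here; as a minimal sketch, the camera permission is the one the LensEngine real-time stream requires (add others, such as storage, only if your app needs them):

```xml
<!-- Minimal sketch: CAMERA is required for the LensEngine real-time camera
     stream; any further permissions depend on your own app's features. -->
<uses-permission android:name="android.permission.CAMERA" />
```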
Preparations for the Code

After adding the permissions to AndroidManifest.xml, we need to add the ML Kit hand gesture service dependencies to the build. For a LibGDX application, we define our Android-specific services under the project(":android") block of the root build.gradle file:

```groovy
project(":android") {
    apply plugin: "android"
    apply plugin: "kotlin-android"
    apply plugin: 'com.huawei.agconnect'
    apply plugin: 'com.android.application'
    // Dagger Hilt
    apply plugin: 'kotlin-kapt'

    configurations { natives }

    dependencies {
        implementation project(":core")
        api "com.badlogicgames.gdx:gdx-backend-android:$gdxVersion"
        annotationProcessor "com.squareup.dagger:dagger-compiler:1.2.2"
        natives "com.badlogicgames.gdx:gdx-platform:$gdxVersion:natives-armeabi-v7a"
        natives "com.badlogicgames.gdx:gdx-platform:$gdxVersion:natives-arm64-v8a"
        natives "com.badlogicgames.gdx:gdx-platform:$gdxVersion:natives-x86"
        natives "com.badlogicgames.gdx:gdx-platform:$gdxVersion:natives-x86_64"
        api "com.badlogicgames.gdx:gdx-box2d:$gdxVersion"
        natives "com.badlogicgames.gdx:gdx-box2d-platform:$gdxVersion:natives-armeabi-v7a"
        natives "com.badlogicgames.gdx:gdx-box2d-platform:$gdxVersion:natives-arm64-v8a"
        natives "com.badlogicgames.gdx:gdx-box2d-platform:$gdxVersion:natives-x86"
        natives "com.badlogicgames.gdx:gdx-box2d-platform:$gdxVersion:natives-x86_64"
        api "com.badlogicgames.ashley:ashley:$ashleyVersion"
        api "com.badlogicgames.gdx-controllers:gdx-controllers-android:$gdxControllersVersion"
        api "org.jetbrains.kotlin:kotlin-stdlib:$kotlinVersion"
        api 'com.huawei.agconnect:agconnect-core:1.4.1.300'
        api 'com.huawei.hms:hianalytics:6.2.0.301'
        api 'com.huawei.hms:hwid:6.1.0.302'
        api 'com.huawei.hms:game:5.0.4.303'
        // ML Kit hand gesture service
        implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:3.2.0.300'
        // Import the hand keypoint detection model package.
        implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:3.2.0.300'
        // Import the hand gesture recognition model package.
        implementation 'com.huawei.hms:ml-computer-vision-gesture-model:3.2.0.300'
    }
}
```

We define the ML Kit hand gesture service dependencies inside the project(":android") dependencies block.

Lens Engine Preview Class

We use this preview class to control the LensEngine and the GraphicOverlay; its start and stop functions start or stop the LensEngine. LensEngine is a class that encapsulates camera initialization, frame obtaining, and logic control.

```kotlin
class LensEnginePreview(private val mContext: Context) : ViewGroup(mContext) {
    private val mSurfaceView: SurfaceView
    private var mStartRequested = false
    private var mSurfaceAvailable = false
    private var mLensEngine: LensEngine? = null
    private var mOverlay: GraphicOverlay? = null

    @Throws(IOException::class)
    fun start(lensEngine: LensEngine?) {
        if (lensEngine == null) {
            stop()
        }
        mLensEngine = lensEngine
        if (mLensEngine != null) {
            mStartRequested = true
            startIfReady()
        }
    }

    @Throws(IOException::class)
    fun start(lensEngine: LensEngine?, overlay: GraphicOverlay?) {
        mOverlay = overlay
        this.start(lensEngine)
    }

    fun stop() {
        if (mLensEngine != null) {
            mLensEngine!!.close()
        }
    }

    fun release() {
        if (mLensEngine != null) {
            mLensEngine!!.release()
            mLensEngine = null
        }
    }

    @Throws(IOException::class)
    private fun startIfReady() {
        if (mStartRequested && mSurfaceAvailable) {
            mLensEngine!!.run(mSurfaceView.holder)
            if (mOverlay != null) {
                val size = mLensEngine!!.displayDimension
                val min = Math.min(size.width, size.height)
                val max = Math.max(size.width, size.height)
                if (isPortraitMode) {
                    mOverlay!!.setCameraInfo(min, max, mLensEngine!!.lensType)
                } else {
                    mOverlay!!.setCameraInfo(max, min, mLensEngine!!.lensType)
                }
                mOverlay!!.clear()
            }
            mStartRequested = false
        }
    }

    private inner class SurfaceCallback : SurfaceHolder.Callback {
        override fun surfaceCreated(surface: SurfaceHolder) {
            mSurfaceAvailable = true
            try {
                startIfReady()
            } catch (e: IOException) {
                Log.e(TAG, "Could not start camera source.", e)
            }
        }

        override fun surfaceDestroyed(surface: SurfaceHolder) {
            mSurfaceAvailable = false
        }

        override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {}
    }

    override fun onLayout(changed: Boolean, left: Int, top: Int, right: Int, bottom: Int) {
        var previewWidth = 480
        var previewHeight = 360
        if (mLensEngine != null) {
            val size = mLensEngine!!.displayDimension
            if (size != null) {
                previewWidth = size.width
                previewHeight = size.height
            }
        }
        if (isPortraitMode) {
            val tmp = previewWidth
            previewWidth = previewHeight
            previewHeight = tmp
        }
        val viewWidth = right - left
        val viewHeight = bottom - top
        val childWidth: Int
        val childHeight: Int
        var childXOffset = 0
        var childYOffset = 0
        val widthRatio = viewWidth.toFloat() / previewWidth.toFloat()
        val heightRatio = viewHeight.toFloat() / previewHeight.toFloat()
        if (widthRatio > heightRatio) {
            childWidth = viewWidth
            childHeight = (previewHeight.toFloat() * widthRatio).toInt()
            childYOffset = (childHeight - viewHeight) / 2
        } else {
            childWidth = (previewWidth.toFloat() * heightRatio).toInt()
            childHeight = viewHeight
            childXOffset = (childWidth - viewWidth) / 2
        }
        for (i in 0 until this.childCount) {
            getChildAt(i).layout(
                -1 * childXOffset, -1 * childYOffset,
                childWidth - childXOffset, childHeight - childYOffset
            )
        }
        try {
            startIfReady()
        } catch (e: IOException) {
            Log.e(TAG, "Could not start camera source.", e)
        }
    }

    private val isPortraitMode: Boolean
        get() {
            val orientation = mContext.resources.configuration.orientation
            if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                return false
            }
            if (orientation == Configuration.ORIENTATION_PORTRAIT) {
                return true
            }
            Log.d(TAG, "isPortraitMode returning false by default")
            return false
        }

    companion object {
        private val TAG = LensEnginePreview::class.java.simpleName
    }

    init {
        mSurfaceView = SurfaceView(mContext)
        mSurfaceView.holder.addCallback(SurfaceCallback())
        this.addView(mSurfaceView)
    }
}
```

Graphic Overlay Class

I use this class to render a series of custom graphics overlaid on top of an associated preview (i.e., the camera preview). The creator can add, update, and remove graphics objects, triggering the appropriate drawing and invalidation within the view. The class supports scaling and mirroring of the graphics relative to the camera's preview properties: detection items are expressed in terms of a preview size, but they need to be scaled up to the full view size and also mirrored in the case of the front-facing camera.

```kotlin
class GraphicOverlay(context: Context?) : View(context) {
    private val mLock = Any()
    private var mPreviewWidth = 0
    private var mWidthScaleFactor = 1.0f
    private var mPreviewHeight = 0
    private var mHeightScaleFactor = 1.0f
    private var mFacing = LensEngine.BACK_LENS
    private val mGraphics: MutableSet<Graphic> = HashSet()
    // ... (the add/remove/clear and drawing methods are not reproduced here)
}
```

Associated [Graphic] items should use the following methods to convert to view coordinates for the graphics that are drawn: [Graphic.scaleX] and [Graphic.scaleY] adjust the size of the supplied value from the preview scale to the view scale.
[Graphic.translateX] and [Graphic.translateY] adjust the coordinate from the preview's coordinate system to the view's coordinate system.

Custom Camera View Class

I create this class to show the device camera on the game screen, so that hand gestures can be recognized and the gesture rectangles drawn while playing.

```kotlin
object CustomCameraView {
    fun initView(
        context: Context,
        gameView: View,
        mPreview: LensEnginePreview,
        mOverlay: GraphicOverlay
    ): View {
        val mainLayout = RelativeLayout(context)
        mainLayout.id = R.id.adLensId
        val lensLayout = RelativeLayout(context)
        val overlayParams = RelativeLayout.LayoutParams(
            ViewGroup.LayoutParams.WRAP_CONTENT,
            ViewGroup.LayoutParams.WRAP_CONTENT
        )
        val previewParams = RelativeLayout.LayoutParams(200, 140)
        val lensParams = RelativeLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, 140)
        overlayParams.addRule(RelativeLayout.ALIGN_PARENT_TOP, RelativeLayout.CENTER_IN_PARENT)
        previewParams.addRule(RelativeLayout.ALIGN_PARENT_TOP, RelativeLayout.CENTER_IN_PARENT)
        lensLayout.addView(mPreview, previewParams)
        lensLayout.addView(mOverlay, overlayParams)
        val gameParams = RelativeLayout.LayoutParams(
            ViewGroup.LayoutParams.MATCH_PARENT,
            ActionBar.LayoutParams.MATCH_PARENT
        )
        gameParams.addRule(RelativeLayout.BELOW, mainLayout.id)
        mainLayout.addView(gameView, gameParams)
        mainLayout.addView(lensLayout, lensParams)
        return mainLayout
    }
}
```

I create two dynamic RelativeLayouts to show the camera on the game screen. To create the dynamic layouts, we define layout params and rules: the layout params set the width and height of each layout, and the rules set where each relative layout is placed on the screen.

Hand Analyzer Transactor

I create the HandAnalyzerTransactor class to process the recognition results. This class implements the MLTransactor<MLGesture> interface. In its result callback, I call the HandGestureGraphic class with the mGraphicOverlay parameter and the result list to get the real coordinates of the result points.
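As a stand-alone illustration of that preview-to-view conversion in plain Kotlin (the class and method names here are illustrative, not the real GraphicOverlay/Graphic API):

```kotlin
// Illustrative sketch (not the actual GraphicOverlay API): detection results
// arrive in preview-space coordinates. They must be scaled by the view/preview
// ratio, and mirrored horizontally when the front-facing lens is active so the
// overlay matches what the user sees on screen.
class PreviewToViewMapper(
    previewWidth: Int,
    previewHeight: Int,
    private val viewWidth: Int,
    viewHeight: Int,
    private val frontLens: Boolean
) {
    private val widthScale = viewWidth.toFloat() / previewWidth
    private val heightScale = viewHeight.toFloat() / previewHeight

    // Adjust a size value from the preview scale to the view scale.
    fun scaleX(x: Float): Float = x * widthScale
    fun scaleY(y: Float): Float = y * heightScale

    // Adjust a coordinate from the preview coordinate system to the view
    // coordinate system, mirroring X for the front camera.
    fun translateX(x: Float): Float = if (frontLens) viewWidth - scaleX(x) else scaleX(x)
    fun translateY(y: Float): Float = scaleY(y)
}
```

For example, with a 640x480 preview drawn in a 1280x960 view and the front lens active, translateX(100f) gives 1080f (scaled to 200, then mirrored) while translateY(100f) gives 200f.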
Hand Gesture Graphic

I use this class to take the result points and adjust the coordinates from the preview's coordinate system to the view's coordinate system.

```kotlin
class HandGestureGraphic(
    overlay: GraphicOverlay?,
    private val results: MutableList<MLGesture>
) { /* ... drawing body not reproduced here ... */ }
```

In the drawing code, I iterate over every member of the results list and translate each rectangle to real view coordinates with the help of the translateRect method. I then read mlGesture.category and use the rect.centerX() method to get the hand's moving coordinate, which I assign to my constant value to move the player.

ML Kit Class

We should create this class under the android folder.

```kotlin
class MLKit(private val context: Context) {
    private var tag: String = "MLKit"
    private var mPreview: LensEnginePreview? = null
    private var mOverlay: GraphicOverlay? = null
    private var mAnalyzer: MLGestureAnalyzer? = null
    private var mLensEngine: LensEngine? = null
    private var mLensType = LensEngine.BACK_LENS

    // Create the hand gesture analyzer.
    fun createHandAnalyzer() {
        val setting = MLGestureAnalyzerSetting.Factory()
            .create()
        mAnalyzer = MLGestureAnalyzerFactory.getInstance().getGestureAnalyzer(setting)
        mAnalyzer!!.setTransactor(HandAnalyzerTransactor(mOverlay!!))
    }

    // Initialize the custom view.
    fun initView(gameView: View): View {
        return CustomCameraView.initView(context, gameView, mPreview!!, mOverlay!!)
    }

    // Initialize LensEnginePreview and GraphicOverlay.
    fun initPreviewAndOverlay(context: Context) {
        mPreview = LensEnginePreview(context)
        mOverlay = GraphicOverlay(context)
    }

    // Create the LensEngine.
    fun createLensEngine() {
        mLensEngine = LensEngine.Creator(context, mAnalyzer)
            .setLensType(mLensType)
            .applyDisplayDimension(640, 480)
            .applyFps(25.0f)
            .enableAutomaticFocus(true)
            .create()
    }

    // Start the lens engine with the preview.
    fun startLensEngine() {
        if (mLensEngine != null) {
            try {
                mPreview!!.start(mLensEngine, mOverlay)
            } catch (e: IOException) {
                Log.e(tag, "Failed to start lens engine.", e)
                mLensEngine!!.release()
                mLensEngine = null
            }
        }
    }

    fun previewStop() {
        mPreview!!.stop()
    }

    fun destroyLensEngineAndAnalyzer() {
        if (mLensEngine != null) {
            mLensEngine!!.release()
        }
        if (mAnalyzer != null) {
            mAnalyzer!!.stop()
        }
    }
}
```

In createHandAnalyzer, I create a hand gesture recognition analyzer from a gesture analyzer setting and set the transactor with the analyzer's setTransactor method. In initView, I initialize the custom camera view. In initPreviewAndOverlay, I initialize the lens engine preview and the graphic overlay. In createLensEngine, I create the LensEngine with the help of the LensEngine creator. In startLensEngine, I check that the lens engine is not null and then start the lens engine preview with the lens engine and graphic overlay parameters. In destroyLensEngineAndAnalyzer, I release the lens engine and stop the ML gesture analyzer.

Kit Module Object

```kotlin
@Module
@InstallIn(ActivityComponent::class)
object KitModule {

    // Account Kit scope
    @ActivityScoped
    @Provides
    fun accountKitProvider(@ApplicationContext context: Context): AccountKit {
        return AccountKit(context)
    }

    // ML Kit scope
    @ActivityScoped
    @Provides
    fun mlKitProvider(@ApplicationContext context: Context): MLKit {
        return MLKit(context)
    }
}
```

We create this module object for Dagger Hilt dependency injection, so our MLKit class can be injected.

Android Launcher Class

I use this class to check the camera permission and trigger the ML Kit functions.
This class is the Android entry point for playing the game on an Android device.

```kotlin
class AndroidLauncher : AndroidApplication(), KitInterface {
    private var isPermissionRequested = false
    private val CAMERA_PERMISSION_CODE = 0
    private var TAG: String = "AndroidLauncherXxxx"
    private var isUserLoggedIn = false

    @Inject
    lateinit var accountKit: AccountKit

    @Inject
    lateinit var mlKit: MLKit

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        accountKit = AccountKit(this)
        mlKit = MLKit(this)
        // Initialize the ML Kit preview and overlay.
        mlKit.initPreviewAndOverlay(this)
        // Create the ML Kit hand analyzer.
        mlKit.createHandAnalyzer()
        // Check the camera permission, then create and start the LensEngine.
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED
        ) {
            mlKit.createLensEngine()
            mlKit.startLensEngine()
        } else {
            checkPermission()
        }
    }

    // Init view.
    private fun initView() {
        val config = AndroidApplicationConfiguration()
        val gameView = initializeForView(DarkSpaceGame(this), config)
        setContentView(mlKit.initView(gameView))
    }

    // Get permissions.
    private fun getAllPermission(): List<String> { ... }

    // ... (the permission request handling, onPause, and onDestroy methods follow)
}
```

I inject the MLKit class with the @Inject annotation. In onCreate, I use the MLKit class's initPreviewAndOverlay method to initialize the LensEnginePreview and the GraphicOverlay; then I check the camera permission, and create and start the LensEngine with the help of the MLKit class. In initView, I create the game view using the initializeForView method, and then use the MLKit class to initialize the custom view with the game view as a parameter. In the onPause method I stop the LensEnginePreview, and in the onDestroy method I release the lens engine and stop the analyzer.

Conclusion

We have now learned how to implement and use the ML Kit hand gesture service in a LibGDX application. We also learned how to create a custom camera view for a LibGDX game.
If you want to learn more about LibGDX and its services, you can check this link. We will continue the LibGDX demo application in the next part of this article by creating our LibGDX game screen and exploring the Ashley entity system library: we will create our player object with an entity factory class, and we will learn about and use components, systems, and engines in the LibGDX game demo.

https://i.redd.it/sofvnltwo4m81.gif

Take care until next time…

References

https://libgdx.com/dev/
https://github.com/libgdx/ashley
https://github.com/HMS-MLKit/HUAWEI-HMS-MLKit-Sample/tree/master/MLKit-Sample
https://developer.huawei.com/consumer/en/doc/development/hiai-Guides/service-introduction-0000001050040017