Tuesday, June 4, 2019

Adaptive User Interface Framework: Android Mobile Platform

Mr. Tulip Pika

Abstract: Adapting a graphical user interface (GUI) to a range of devices with completely different capabilities is an exciting topic in mobile computing. The user interface created for an application ought to adapt its layout and components to the user's needs and change for every user. We propose a model for mobile applications that makes user interfaces adaptable to the user. It identifies an appropriate expertise level for a user by learning his/her history of interaction. Dynamic app shortcuts are to be provided on mobile devices, saving the user from swiping the screen many times to search out the required app. The prediction model utilizes multiple features, including recency, frequency, duration, time distribution and app launch sequence.

Keywords: HCI in mobile; AI and expert systems; adaptive user interface; modeling; k-means algorithm; dynamic shortcuts; mobile app usage; personalization.

I. Introduction

An adaptive user interface (also known as an AUI) is a user interface (UI) which adapts, that is, changes its layout and elements to the needs of the user or context, and is similarly alterable by each user. These mutually reciprocal qualities of both adapting and being adaptable are, in a true AUI, also innate to the elements that comprise the interface's components: portions of the interface might adapt to and affect other portions of the interface. The user adaptation is often a negotiated process, as an adaptive user interface's designers do not fix where user interface components ought to go, instead affording a means by which both the designers and the user can determine their placement, often (though not always) in a semi-automated, if not fully automated, manner.
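To make the idea of interface elements that change with the user concrete, here is a minimal Python sketch of gating components by experience level. The component names and level labels are hypothetical illustrations, not taken from the framework itself:

```python
# Each component declares the lowest experience level at which it becomes
# visible. Names and levels below are illustrative examples only.
LEVELS = ["novice", "intermediate", "expert"]

def visible_components(components, user_level):
    """Return only the components the user's experience level unlocks."""
    rank = LEVELS.index(user_level)
    return [name for name, min_level in components
            if LEVELS.index(min_level) <= rank]

COMPONENTS = [
    ("search_box", "novice"),         # always shown
    ("filter_panel", "intermediate"),
    ("bulk_edit", "expert"),          # shown only to experienced users
]
```

For a novice user this yields only the basic component, while an expert sees the full set, which is the adapt-and-hide behavior described above.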
An AUI is primarily created based on the features of the system and the knowledge levels of the users that will utilize it.

Figure 1: Adaptive Graphical User Interface

The advantages of an adaptive user interface are found within its ability to conform to a user's needs. The properties of an AUI include showing only relevant information based on the current user. This creates less confusion for less experienced users and provides ease of use throughout a system. Depending on the task, we can increase the stability of a system. An adaptive user interface can be implemented in various ways. These implementations can differ in the amount of information made available to different users, or in how users utilize the application.

Adaptive presentation: The goal behind adaptive presentation is to display certain information based on the current user. This may mean that users with only basic knowledge of a system will only be shown minimal information. Conversely, a user with advanced knowledge will have access to more detailed information and capabilities. One way the AUI can achieve this distinction is to decide which information to hide or present based on the user's experience level. Another possibility is to control the number of links to relevant sources on the page.

Adaptive navigation: Adaptive navigation intends to guide a user to their specific goal within the system by altering the way the system is navigated based on certain factors of the user. These factors can include the user's expertise level with the system or subject, the current goal within the system, and other relevant factors. Adaptive navigation can be achieved in many ways, similar to adaptive presentation. Examples include providing links that help achieve the user's specific goal, giving a reference on a page to where the user currently is, or altering the resources available to the user.

II. Motivation

In the last few years, an ecosystem of devices and heterogeneous services has emerged with a huge assortment of capacities and characteristics. These new devices, along with applications and services, must be used to enhance the quality of life, making the user's daily activities easier as well as increasing their personal autonomy.

User interfaces in mobile applications are complex, since they need to provide sufficient features to a variety of users in a restricted space where only a small number of components are available. When users acquire expertise in the system, they expect user interfaces which cater to their unique needs. Therefore, user interfaces in mobile applications should be adapted to different users. Since this problem exists across applications, a general solution is required that makes user interfaces adaptive using user context history.

Figure 2: Different Mobile Devices

In this sense, there is a clear need for creating interfaces that adapt themselves taking into account characteristics of the user, context, application and device. One of the aspects to consider when adapting interfaces is the set of preferences of the user. When using different applications or devices, each user has different preferences, mainly related to their limitations.

III. Problem Statement

Using a mobile device and its applications is a personalized experience. Each user has different preferences, mainly related to their limitations. Hence it is essential to account for characteristics of the user, context, application and device while designing a graphical user interface for the mobile platform. When there are many applications (apps) installed on a mobile device, the simple task of launching an app can become inconvenient, as the user may need to swipe the screen several times to find the desired app. Hence an adaptive user interface solution for mobile devices, which uses dynamic shortcuts to facilitate app launching, is needed.
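As a sketch of how such dynamic shortcuts might be ranked, the following Python snippet combines two of the features named in the abstract, frequency and recency. The half-life weighting and the function name are illustrative assumptions of mine, not the paper's actual prediction model:

```python
def shortcut_scores(usage_log, now, half_life=3600.0):
    """Score each app by combining launch frequency with exponentially
    decayed recency: every half_life seconds halves a launch's weight."""
    scores = {}
    for app, launch_time in usage_log:
        age = now - launch_time
        scores[app] = scores.get(app, 0.0) + 0.5 ** (age / half_life)
    # highest-scoring apps first; the top few become the dynamic shortcuts
    return sorted(scores, key=scores.get, reverse=True)
```

A full implementation would also fold in duration, time distribution and app launch sequence, as the proposed prediction model does.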
In this context, personalization of applications, i.e. applications that adapt themselves to users' capacities and limitations, is essential.

IV. Problem Modeling

A. Overview

Rather than providing adaptive user interfaces for one specific mobile application, it is more valuable to offer a common solution to make any UI adaptive. So it is proposed to provide a framework giving a common solution that can be used by all developers to create applications with adaptive user interfaces. This framework provides adaptive user interfaces based on the user's experience level. The experience levels are classified by the inference engine, which is explained in the subsection on the learning step. The system will learn the user's experience level from the user actions performed on each component of the application, using the learning algorithm.

Figure 2: Concept of Adaptive User Interface

Figure 3: Module diagram for the system with Adaptive User Interface

B. Components of the Solution

The proposed adaptive user interface mainly focuses on hiding groups of unwanted components for the corresponding experience level of the user on that application. The framework consists of three main phases:

1. Data preprocessing step
2. Learning step
3. Execution and rendering step

Data preprocessing step:

1. Location Data: One of the factors used to adapt the UI is the location of the user. This is based on the premise that the type of applications a user is expected to access when at home is different from the type of applications accessed when the user is at work. The location is determined by means of the GPS sensor on the mobile device.

2. Device Data: Output of other sensors on the device, including the ambient light sensor (to infer whether the user is indoors or outdoors), accelerometer and gyroscope (to tell whether the user is stationary or moving), can also be used to derive additional contextual information in order to better predict the user's chosen application and modify the UI appropriately.

3. App Usage Data: Logs of past application usage, the frequency at which a particular app was accessed, and the user actions and interactions while using the app can act as another source of contextual information.

4. Time Data: The type of applications accessed on weekdays might be different from the applications accessed on a weekend or on holidays. Similarly, in the morning the user may access different apps than the ones they do at night. A logging service running on the device would have to log the types of apps accessed at specific times of day or days of the week, and use this to make the appropriate UI modifications.

C. Learning step

The main purpose of the inference engine is to collect the data provided by the data preprocessing module and provide an experience level of the user according to the current user context. To infer the experience level of the user, the inference engine should behave as an intelligent system, trained with data relating user experience levels to user interactions within the applications.

Figure 4: A high-level architecture of the adaptive user interface framework

Execution and rendering step: The K-means clustering engine allows setting the number of clusters needed. Once the number of clusters is set, the engine clusters the dataset until the squared error is minimized. This gives each cluster's center point as output. Once the cluster centers are found, they are delivered to the user type selector. The user type selector will assign an experience level to each center sent by the K-means clustering engine.
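A minimal Python sketch of this clustering step and the user type selector follows. The one-dimensional interaction scores, the label names and the random initialization are illustrative assumptions; the framework's actual engine may differ:

```python
import random

def kmeans_1d(data, k, iters=20, seed=0):
    """Plain k-means on 1-D interaction scores: assign each point to its
    nearest center, then move each center to the mean of its points."""
    centers = random.Random(seed).sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda i: (x - centers[i]) ** 2)
            clusters[nearest].append(x)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)  # ascending, so labels map low to high expertise

def user_type(score, centers, labels=("novice", "intermediate", "expert")):
    """User type selector: label of the cluster center closest to the score."""
    nearest = min(range(len(centers)), key=lambda i: abs(score - centers[i]))
    return labels[nearest]
```

Here the experience labels are attached to the sorted centers by hand, mirroring the manual assignment described below.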
Currently we have manually assigned the experience levels for the identified centers using expert knowledge. As mentioned earlier, experts who know the system can suggest these levels for each cluster center. The current user's context data is fed into the user type selector, which infers the closest suitable experience level. This final output is delivered to the execution and rendering step.

D. Implementation

In order to practically show the behavior of the framework, a proof of concept (POC) application will be developed: a simple online flight-ticket reservation system. This application was developed in Hyper Text Markup Language (HTML) and JavaScript. The Android platform provides enough features and Application Programming Interfaces (APIs) to create an Android application using HTML and JavaScript. An Android web application can be created by converting an HTML page using the WebView class, and many third-party frameworks and plug-ins are available to convert HTML and JavaScript pages into Android applications.

This application will be connected to the adaptive UI framework using a component called UIhooks. UIhooks are methods which can be used by the developer during application development, for example when events are fired on UI components. When UIhooks are called, they measure the user actions performed on the corresponding UI component and store them. For example, when a UIhook method is attached to a button's on-click event, the UIhook measures how many times the button was clicked and the most recent time it was used. If a UIhook method is attached to a textbox's on-submit event, the UIhook can inspect and store the value submitted and the count of submit actions performed. This application is sent to a user study to collect training data.
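The UIhook idea can be sketched as a small recorder attached to one component. This is a Python illustration of the concept only; the class and method names are my assumptions, and the actual hooks are JavaScript methods bound to UI events:

```python
import time

class UIHook:
    """Per-component usage recorder: counts events, remembers the most
    recent use, and stores any submitted values."""
    def __init__(self, component_id):
        self.component_id = component_id
        self.count = 0           # e.g. number of button clicks
        self.last_used = None    # time of the most recent event
        self.values = []         # values submitted through the component

    def on_event(self, value=None, timestamp=None):
        """Called whenever the component's event (click, submit, ...) fires."""
        self.count += 1
        self.last_used = time.time() if timestamp is None else timestamp
        if value is not None:
            self.values.append(value)
```

The stored counts and timestamps are exactly the per-component interaction data the inference engine later consumes as training input.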
This is elaborated further in the user study section. The collected data were organized and fed to the inference engine as the training dataset. The inference engine learned from the data as elaborated before and gave the suitable experience level. Once the experience level is fed to the rendering engine, it finds the related rendering logic inside the UI clusters. For example, if the experience level is intermediate, it checks the corresponding rendering logic and UI clusters: if the rule says "if user type is intermediate, render cluster2", it builds a new UI using what is specified in cluster2 and renders it to the user. When the user is presented with the new adaptive UI, a question is shown asking whether they are satisfied with the new UI or want to go back to the earlier stage. This measures their satisfaction level and the accuracy of the algorithm's predictions.

V. Mathematical Modeling

Let the system be a main set {SDB, LDB, C, A, S, MR, AO}, where:

SDB is the copy of the server database. This database is responsible for storing user information related to cloud interactions.

LDB is a set of local databases that a user owns. It consists of data tables holding data items related to the products and their sales transactions.

C is the set of all clients using the server database and mining services from the server, with (c1, c2, c3, ..., cn) in C.

A is a set of algorithms applied on the input data to get mining results.

S is the server component of the system. The server is responsible for registering, authenticating and providing associations to the end user.

MR is a set of mining rules that are applied on the input dataset provided by the client from his LDB, with (mr1, mr2, mr3, ..., mrn) in MR.

AO is a set of associations that are extracted from the input and form the output of the system.

Functionalities:

SDB = RegisterUser(uid, password, fullname, address, country, contact, email)
password = SHA1(input_password)
U = AuthenticateUser(uid, password, SDB)
LDB1 = ManageProducts(pid, product_name, cost)
LDB2 = ManageBilling(transactions, items)
LDB = LDB1 + LDB2
ED (encoded data) = EncodeTransactions(LDB2, EncodingAlgorithm(EA))
UPLOAD(ED)
AO = ApplyMining(ED)
Results = Decode(Download(AO))

VI. Expected Results

Figure 5: Dynamic Shortcuts

Figure 6: Adaptive UI

VII. Conclusion

The aim of our study was to propose a high-level architecture for a framework providing adaptive user interfaces for mobile applications. This framework includes a data preprocessing step, a learning step, and an execution and rendering step to deliver a suitable user interface. The learning is done by an intelligent system which is unsupervised and trained using user context data; it delivers k experience levels by clustering the collected dataset using the K-means and ANN algorithms. It will also allow dynamic shortcuts to facilitate app launching. Other options to enhance the proposed dynamic shortcut solution, such as gesture-based control, will be explored in future work.
