Category Archives

3 Articles
Heads up for new Fall 2018 Wear OS

In mid-August, Google announced that they would be releasing an updated user interface for Wear OS. It has already begun to roll out, but a lot of information is still unknown: it hasn’t been shared with developers and isn’t available in a preview. Luckily, the Reddit user ntauthy was able to find a way to manually enable these existing features on some versions of Wear OS. I’m going to go over the changes that were announced for developers as well as the ones that have not been.

The following changes were announced by Hoi Lam on the G+ Wear OS Developers community.

More Concise Text

Notifications now show a smaller amount of text initially, but users can still expand the notification by tapping on it.

Set custom colors for notifications

Brand Colors

Notifications can show any color you set using the setColor() method on your notification’s builder.
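As a quick illustration, a brand color on a standard NotificationCompat builder might look like this (the channel id, icon, text, and color resource are placeholders, not from the announcement):

NotificationCompat.Builder builder =
        new NotificationCompat.Builder(context, "updates_channel") // hypothetical channel id
                .setSmallIcon(R.drawable.ic_notification)
                .setContentTitle("Sync complete")
                .setContentText("All of your items are up to date.")
                // The new Wear OS notification UI can show this brand color.
                .setColor(ContextCompat.getColor(context, R.color.brand_color));

NotificationManagerCompat.from(context).notify(NOTIFICATION_ID, builder.build());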

No More Custom Layouts

Tapping on a notification will no longer bring up a custom layout for expanded content, but rather show the full text of a notification with additional actions.

The following changes were seen by me in the new version by manually enabling feature flags. Allow me to make the disclaimer that these are not confirmed for the official release, but they are within the Wear OS app.

Now Playing status in the quick settings panel

Quick Settings Controls

The current media session from your watch or paired device now displays in the top status bar, showing the current title. A Play/Pause transport control is available. Tapping on the title triggers the Content Intent of the MediaStyle notification if it was posted from the watch, or brings up the Media Controls activity if not.
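For reference, a MediaStyle notification posted from the watch app would look something like this; the media session, pending intent, icon, and channel id are placeholders, and this sketch only shows where that Content Intent comes from:

// Assumes an existing MediaSessionCompat named mediaSession and a PendingIntent
// named openPlayerPendingIntent; both are placeholders for this sketch.
NotificationCompat.Builder mediaBuilder =
        new NotificationCompat.Builder(context, "playback_channel")
                .setSmallIcon(R.drawable.ic_play)
                .setContentTitle("Track title")
                // Tapping the title in the quick settings panel fires this intent.
                .setContentIntent(openPlayerPendingIntent)
                .setStyle(new androidx.media.app.NotificationCompat.MediaStyle()
                        .setMediaSession(mediaSession.getSessionToken()));

NotificationManagerCompat.from(context).notify(MEDIA_NOTIFICATION_ID, mediaBuilder.build());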

The Media Controls Activity

Media Controls in the app drawer and open

One thing that surprised me was a new Activity available in the app drawer that brings up extended controls for the current Media Session. It displays transport controls with a volume slider and the current title. The album art displays full screen behind it with the time at the top. After enabling it, I later got a notification telling me that this activity would automatically be launched when I started playing something, but that this could be turned off by long-pressing on the screen while it was open. This seems to replace the full-screen controls previously available, while letting users disable it if they don’t like it taking up the whole screen.

The Content intent on new notifications

While tapping on a notification in the old version triggered the content intent if one was set, this is not the case in the new one. Instead, the content intent shows up as another action labeled “Open”.

No more progress bar

New notification with no progress bar
Old notification with progress bar

In the old version, a circular progress bar could be shown around the notification’s icon. This does not appear to be the case in the new version, so developers should not depend on it for showing progress.
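If you were drawing that ring with setProgress() on a NotificationCompat builder like the one shown earlier (a sketch, assuming that is how your progress was posted), the call in question is simply:

// On the old Wear UI this could render as a ring around the notification icon;
// on the new UI it does not appear to show, so don't rely on it for progress.
builder.setProgress(100 /* max */, 40 /* progress */, false /* indeterminate */);
NotificationManagerCompat.from(context).notify(NOTIFICATION_ID, builder.build());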

Let me know if I am missing anything or if you have any questions.

Getting User Input on Wear OS – Part 2 – Keyboard

In the previous post, I explained how I set up a “Get Input” fragment for my apps with a voice or keyboard option and how to get the voice input. In this post I will explain how to get input from the keyboard.

I’ve seen some apps use the RemoteInput API, but this feels jarring to me since it seems to take you out of the app. Of course, they or I could be doing this wrong. The only other way I have found in the documentation is using an EditText. However, you may not want the user to have to select the text field first, so the method I use is more similar to the Play Store, where selecting the keyboard button takes you directly to the keyboard.

We will first create a fragment with just an EditText view. Wear OS will open the keyboard when the text box is selected. To achieve this, we are going to open the fragment and programmatically select the EditText view.

You will want to change the IME type of the view to whatever best suits your situation. In this example, we are using it as a search. Be sure to change it in both the code and the layout, otherwise the types will not match and you will not get an input back.
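For example, if you wanted the Done action instead of Search (just an illustration, not the code used here), both sides have to change together:

// In the layout: android:imeOptions="actionDone"
// In the editor action listener: check for the matching constant.
if (actionId == EditorInfo.IME_ACTION_DONE) {
    // handle the submitted text
}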

Here is the layout for the KeyboardInputFragment:

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:visibility="gone">

    <EditText
        android:id="@+id/search_input"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:imeOptions="actionSearch"
        android:inputType="text"
        android:maxLines="1"
        android:visibility="visible" />
</FrameLayout>

Notice the whole layout is set to Gone. The keyboard will take up the whole screen, so we do not need to show anything. Here is the code for the KeyboardInputFragment:

public class KeyboardInputFragment extends Fragment {

    private GetInputFragment getInputFragment;

    @Override
    public View onCreateView(LayoutInflater inflater, @Nullable ViewGroup container, Bundle savedInstanceState) {
        View view = inflater.inflate(R.layout.fragment_keyboard_input, container, false);
        final KeyboardInputFragment keyboardInputFragment = this;
        final EditText editText = view.findViewById(R.id.search_input);

        editText.setOnEditorActionListener(new TextView.OnEditorActionListener() {
            @Override
            public boolean onEditorAction(TextView textView, int actionId, KeyEvent keyEvent) {
                boolean handled = false;
                if(actionId == EditorInfo.IME_ACTION_SEARCH) {
                    getInputFragment.sendResult(editText.getText().toString());
                    handled = true;
                    getActivity().getFragmentManager().beginTransaction().remove(keyboardInputFragment)
                            .commitAllowingStateLoss();
                }
                return handled;
            }
        });

        showSoftKeyboard(editText);
        return view;
    }

    public void setGetInputFragment(GetInputFragment getInputFragment) {
        this.getInputFragment = getInputFragment;
    }

    private void showSoftKeyboard(View view) {
        if(view.requestFocus()) {
            InputMethodManager imm = (InputMethodManager) getContext().getSystemService(Context.INPUT_METHOD_SERVICE);
            if(imm != null) {
                imm.showSoftInput(view, InputMethodManager.SHOW_IMPLICIT);
                imm.toggleSoftInput(0, 0);
            } else {
                Cat.e("Couldn't open keyboard");
            }
        }
    }
}

The showSoftKeyboard function is what selects the EditText view to open the keyboard. The setGetInputFragment method passes in an instance of the previous fragment so that we can send the text back once we get something from the user. After we have sent the information back, we can close this fragment, and the fragment from the previous post will have the text it needs to complete the action.
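For context, this is roughly how the GetInputFragment from the previous post could launch it when the keyboard button is tapped (a sketch; the container id is a placeholder for however your activity hosts its fragments):

// Inside GetInputFragment, e.g. in the keyboard button's click listener.
KeyboardInputFragment keyboardInputFragment = new KeyboardInputFragment();
// Hand this fragment over so sendResult() can deliver the typed text back.
keyboardInputFragment.setGetInputFragment(this);
getActivity().getFragmentManager().beginTransaction()
        .add(R.id.fragment_container, keyboardInputFragment) // hypothetical container id
        .commit();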

Hope this helps! You could probably do this in a separate activity and send the result back in an intent. I also thought extending a popup window might work. Or am I using the RemoteInput API wrong? I’d be interested in hearing some other solutions!

Getting user input on Wear OS – Part 1 – Voice

Google Play’s input screen

When I first began programming for Wear OS, I was expecting an easy out-of-the-box method for getting user input similar to how Google Play works: giving the user the option to speak or type, along with a list of canned inputs to choose from.

Unfortunately, this isn’t as easy as I thought, and the developer page seemed pretty vague and unhelpful, but with a little work I was able to come up with a pretty small amount of code to accomplish what I needed. I will explain the basics below.

To begin, create a layout that looks something like this:

Component tree

I use a dismiss layout as the wrapper so the user can swipe the input away if they change their mind, with a box inset layout inside that to make sure it is readable on any screen. A scroll view isn’t necessary if you aren’t going to include any canned responses. I also include a microphone button and a keyboard button.

I use a Fragment class to include all the code for the input. Here’s the basic code for getting the input from the mic:

// Create an intent that can start the Speech Recognizer activity
private void displaySpeechRecognizer() {
    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    // Start the activity, the intent will be populated with the speech text
    startActivityForResult(intent, SPEECH_REQUEST_CODE);
}

This is the function to run when the user taps the voice input button. It starts the speech recognizer intent. A private integer constant can be used for SPEECH_REQUEST_CODE so we can match the result later when we override onActivityResult.
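For completeness, the constant and the button wiring might look like this (the request code value and the button id are placeholders, they aren't shown in the post):

private static final int SPEECH_REQUEST_CODE = 1001; // any value unique within this fragment

// In onCreateView, after inflating the layout:
view.findViewById(R.id.voice_button).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        displaySpeechRecognizer();
    }
});

With that in place, the result comes back to onActivityResult: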

@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    Cat.d("Got Request Code " + SPEECH_REQUEST_CODE + "  " + requestCode);
    Cat.d("Got Result Code " + resultCode + " " + RESULT_OK);
    if (requestCode == SPEECH_REQUEST_CODE) {
        try {
            List<String> results = data.getStringArrayListExtra(
                    RecognizerIntent.EXTRA_RESULTS);
            String spokenText = results.get(0);
            sendResult(spokenText);
        } catch (NullPointerException ex) {
            Cat.d("No result received");
            // If they are using android wear 1.x, back out now
            if(Build.VERSION.SDK_INT == Build.VERSION_CODES.M) {
                removeThis();
            }
        }
    }
    super.onActivityResult(requestCode, resultCode, data);
}

This function runs when a result comes back. We need to check that the request code matches the one we sent, then get the first result. There is also the possibility that there won’t be a result. If that is the case, we either do nothing, or remove the fragment if it is Android Wear 1.x, since there is no keyboard input to fall back on.
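The sendResult and removeThis methods aren’t shown above; here is a minimal sketch of what they could look like in this fragment, assuming a simple callback interface for whoever opened the input screen (the interface name and wiring are illustrative, not from the post):

public interface InputListener {
    void onInput(String text);
}

private InputListener inputListener; // set by whoever opened the input screen

public void sendResult(String text) {
    if (inputListener != null) {
        inputListener.onInput(text);
    }
}

// Used above when an Android Wear 1.x watch comes back with no result.
private void removeThis() {
    getActivity().getFragmentManager().beginTransaction()
            .remove(this)
            .commitAllowingStateLoss();
}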

In the next part, I will explain an easy way to get keyboard input without leaving the app.