Discussion:
Enabling certified app debugging on production phones.
Paul Theriault
2014-09-08 09:20:02 UTC
Currently, in order to debug certified apps (i.e. Gaia apps) you need a rooted phone, so that you can set the "devtools.debugger.forbid-certified-apps" preference to false. Having this preference set to true is required on production phones, because allowing certified-app debugging gives basically root-level access through the remote debugger. But it leaves the strange situation where you can install certified apps but can't debug them, which isn't particularly useful, and also means a large attack surface for an attacker with physical access.
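(For reference, the current rooted-phone workaround looks roughly like the sketch below. The profile path, the stop/start b2g step and the use of adb root are from memory and vary by build, so treat it as illustrative rather than a supported tool.)

#!/usr/bin/env python3
# Rough sketch only: flip devtools.debugger.forbid-certified-apps over adb on a
# *rooted* device. Paths and commands are assumptions that differ between builds.
import subprocess

PREF_LINE = 'user_pref("devtools.debugger.forbid-certified-apps", false);'

def sh(*args):
    # Run a command, raise on failure, return stdout.
    return subprocess.run(args, check=True, capture_output=True, text=True).stdout

def main():
    sh("adb", "root")                  # only succeeds on rooted/engineering builds
    sh("adb", "shell", "stop", "b2g")  # stop Gecko so it doesn't rewrite prefs.js behind us
    # Find the Gecko profile directory (assumed to end in ".default" under /data/b2g/mozilla).
    profiles = sh("adb", "shell", "ls", "/data/b2g/mozilla").split()
    profile = next(p for p in profiles if p.endswith(".default"))
    prefs = f"/data/b2g/mozilla/{profile}/prefs.js"
    # Append the pref; if duplicates exist, the last entry parsed wins.
    sh("adb", "shell", f"echo '{PREF_LINE}' >> {prefs}")
    sh("adb", "shell", "start", "b2g")

if __name__ == "__main__":
    main()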

The challenge we had when talking through this situation previously was that it's difficult to distinguish between the device's owner and someone who has just found your phone and wants to take advantage of developer mode to compromise your phone and/or data.

My team has been working on a proposal to remedy this situation:
- Introduce an “os-developer” mode
- Provide a way in FTU to have the user choose a lockscreen pass code (not necessarily enabled, just chosen)
- Add UI into developer settings to enable os-developer mode, which requires the user to enter their passcode
- When enabled, this mode allows installing and debugging certified apps. When disabled, certified app installation & debugging is forbidden.
- The user MUST set a lockscreen code during FTU for os-developer to be available. If they do not, os-developer mode is disabled, and can only be enabled later by doing a factory reset and then redoing FTU.
- Note that the user does not have to ENABLE the lockscreen during FTU; they just have to at least choose a passcode. But encouraging users to set a passcode comes with its own benefits. (A rough sketch of the intended gating follows this list.)
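Something like the following, purely as an illustration of the gating rules above (the names are made up; this is not real Gaia/Gecko code):

# Illustrative sketch only; names like passcode_chosen_at_ftu are invented.
from typing import Optional

class DeveloperModeGate:
    def __init__(self, passcode_chosen_at_ftu: bool, lockscreen_passcode: Optional[str]):
        self.passcode_chosen_at_ftu = passcode_chosen_at_ftu
        self.lockscreen_passcode = lockscreen_passcode
        self.os_developer_enabled = False

    def available(self) -> bool:
        # os-developer is only offered if a passcode was chosen during FTU;
        # otherwise the only way to get it is a factory reset and redoing FTU.
        return self.passcode_chosen_at_ftu

    def enable(self, entered_passcode: str) -> bool:
        # Turning it on from developer settings requires re-entering the passcode.
        if not self.available() or entered_passcode != self.lockscreen_passcode:
            return False
        self.os_developer_enabled = True
        return True

    def certified_debugging_allowed(self) -> bool:
        # While disabled, certified app installation & debugging stay forbidden.
        return self.os_developer_enabled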

Pros:
- Allows development of certified apps & Gaia hacking on production, unrooted phones while protecting the user's data

Cons:
- A user must set a passcode at FTU (and remember it!), else they won't be able to use this mode without a factory reset
- In the past there has been pushback on having passcode selection in FTU

There are a lot of other details and considerations, but I'll keep it short(er) for now to start discussion. Does anyone think this is a useful change, or is there a better way to enable certified app debugging whilst protecting user data? If you are interested, there is a more detailed proposal here: [1]

Thoughts & suggestions welcome.

- Paul

[1] https://docs.google.com/a/mozilla.com/document/d/11Q1_fj2nKciVyG2PdGH_LuiH09BhZJKzFjBuIcaHKqs/edit#
Jan Jongboom
2014-09-09 12:16:33 UTC
Wow, interesting catch. I always assumed that it was not possible to install certified apps on a non-rooted phone. So yeah, any way we can make this possible on non-rooted phones will get applause.

Pin code sounds like a proper way of enabling this on consumer phones.
Jared Hirsch
2014-09-09 20:05:46 UTC
Hi Paul,

Nice work on the proposal! I would love to see us lower the barrier to hacking on Gaia; I have some feedback below.

BTW, I work on Gaia stuff for Cloud Services; this includes Firefox Accounts, Find My Device, and prototyping work for backup/restore (though it seems other people are working on this independently, too). I'm very happy to discuss user/device security and user identity any time. I'm usually in #gaia during Pacific business hours.
Post by Jan Jongboom
Post by Paul Theriault
The challenge we had when talking through this situation previously was that its difficult to distinguish between the device's owner & someone who has just found your phone, and wants to take advantage of developer mode to compromise your phone and/or data.
Find My Device allows users to remotely lock or wipe a lost device. It shipped in 2.0.

It seems to me that FMD takes care of this particular threat (malicious person compromises lost device).

So, maybe the user doesn't need to prove device ownership before enabling certified debugging?
Post by Jan Jongboom
Post by Paul Theriault
- Introduce an "os-developer" mode
- Provide a way in FTU to have the user choose a lockscreen pass code (not necessarily enabled, just chosen)
- Add UI into developer settings to enable os-developer mode, which requires the user to enter their passcode
- When enabled, this mode allows installing and debugging certified apps. When disabled, certified app installation & debugging is forbidden.
- The user MUST set a lockscreen code during FTU for os-developer to be available. If they do not, os-developer mode is disabled, and can only be re-enabled through the process of a factory-reset then redoing FTU.
- Note that the user do not have to ENABLE the lockscreen during FTU, they just have to at least choose a passcode. But encouraging users to set a passcode comes with its own benefits.
The "developer PIN" concept and UX seem quite complex.

What if we just add an "enable certified app debugging" checkbox to the developer menu?

The two goals in the linked Google doc are (1) managing security risk from lost devices (see my FMD comments above), and (2) giving users full access if they want to hack on Gaia. I think my counterproposal here enables (2) with a lot less work and a reduced barrier to user experimentation (no passcode, no need to factory reset if you didn't set a flag during FTU).

Cheers,

Jared
Jan Jongboom
2014-09-09 20:31:46 UTC
Well, you need to enforce a PIN, because otherwise anyone who finds your phone can grab all the data; or you would have to wipe the device whenever someone enables that menu, but you don't want that either, I'd say.
Jared Hirsch
2014-09-09 20:49:29 UTC
Post by Jan Jongboom
Well you need to enforce PIN because otherwise everyone who finds your phone can grab all the data, or you should wipe it out whenever someone enables that menu but you don't want that either I'd say.
Actually, I think we're fine here - I don't see how this differs from the threat mentioned already.

If an attacker finds your phone, then presumably you have lost it. In that case, you can use FMD to lock or wipe it remotely.

I'm specifically suggesting that we don't need the "developer PIN", but maybe you're referring to the lockscreen PIN? I definitely do think the users should have their lockscreen PIN enabled ^_^
Paul Theriault
2014-09-10 03:17:30 UTC
Post by Jared Hirsch
Well you need to enforce PIN because otherwise everyone who finds your phone can grab all the data, or you should wipe it out whenever someone enables that menu but you don't want that either I'd say.
Actually, I think we're fine here - I don't see how this differs from the threat mentioned already.
If an attacker finds your phone, then presumably you have lost it. In that case, you can use FMD to lock or wipe it remotely.
If you actually set up FMD. If it has battery. If it has network connectivity etc.
Post by Jared Hirsch
I'm specifically suggesting that we don't need the "developer PIN", but maybe you're referring to the lockscreen PIN? I definitely do think the users should have their lockscreen PIN enabled ^_^
Yeah, I agree that is confusing in the doc - we talk about a developer PIN code which is kept in sync with the lockscreen passcode. Overly complex, I think. It should just BE the lockscreen passcode.
Jan Jongboom
2014-09-10 07:46:48 UTC
+1
Jared Hirsch
2014-09-10 18:05:03 UTC
Post by Paul Theriault
Post by Jared Hirsch
I'm specifically suggesting that we don't need the "developer PIN", but maybe you're referring to the lockscreen PIN? I definitely do think the users should have their lockscreen PIN enabled ^_^
Yeh I agree that is confusing in the doc - we talk about a developer pin code which is kept in sync with the lockscreen passcode. Overly complex i think. It should just BE the lockscreen passcode.
Great! Yeah, I found it hard to understand when lockscreen passcode changes would/wouldn't also change the developer pin.

In that case, do you still think it's necessary to require the passcode be set during FTU?
Stephanie Ouillon
2014-09-10 18:16:12 UTC
Post by Jared Hirsch
In that case, do you still think it's necessary to require the passcode be set during FTU?
Yes, because a user is not forced to set a PIN code for the lockscreen.
If it is not set, an attacker could just go into the settings,
set their own PIN code, and enable debugging.

Kartikaya Gupta
2014-09-09 13:00:12 UTC
Post by Paul Theriault
The challenge we had when talking through this situation previously was that its difficult to distinguish between the device's owner & someone who has just found your phone, and wants to take advantage of developer mode to compromise your phone and/or data.
Thanks for pointing this out, as it is an important distinction that is
the heart of the problem.
Post by Paul Theriault
- A user must set passcode at FTU (and remember it!), else they wont be able to use this mode without a factory reset
When they do a factory reset, is there a mechanism available for them to
backup and restore their data? (I admit I'm unfamiliar with what the
average user would use for this - a quick search online seems to
indicate you have to use adb to do this). If there is a mechanism, what
prevents the "malicious person who just found your phone" from doing
this data backup and stealing your data? Is this somehow a less-bad
scenario than the malicious person being able to enable os-developer mode?

I just worry that forcing a factory reset in this scenario is going to
place a big barrier to allowing our users to organically grow from
"users" to "webmaker". That is, they will find it much harder to learn
and hack their phones in ways that we should be actively encouraging.

Seeing as the heart of the problem is distinguishing the device owner
and Mr. Malicious, perhaps we could ask for some piece of information
the device owner is much more likely to have. The SIM PIN might be such
a thing, or maybe some other unique identifier that comes with the phone
but isn't physically present or accessible on the handset itself.

Cheers,
kats
Stéphanie Ouillon
2014-09-09 13:53:11 UTC
Hi,
Post by Kartikaya Gupta
Post by Paul Theriault
The challenge we had when talking through this situation previously
was that its difficult to distinguish between the device's owner &
someone who has just found your phone, and wants to take advantage of
developer mode to compromise your phone and/or data.
Thanks for pointing this out, as it is an important distinction that is
the heart of the problem.
Post by Paul Theriault
- A user must set passcode at FTU (and remember it!), else they wont
be able to use this mode without a factory reset
When they do a factory reset, is there a mechanism available for them to
backup and restore their data? (I admit I'm unfamiliar with what the
average user would use for this - a quick search online seems to
indicate you have to use adb to do this). If there is a mechanism, what
prevents the "malicious person who just found your phone" from doing
this data backup and stealing your data? Is this somehow a less-bad
scenario than the malicious person being able to enable os-developer mode?
Definitely not, since what we want to achieve ultimately is protecting
the user's data. But I don't know the details of the possible solutions
for the backup and restore mechanism, so I'll let better informed people
answer this.
Post by Kartikaya Gupta
I just worry that forcing a factory reset in this scenario is going to
place a big barrier to allowing our users to organically grow from
"users" to "webmaker". That is, they will find it much harder to learn
and hack their phones in ways that we should be should be actively
encouraging.
This 'os-developer' mode is meant for people who want to write and debug
certified apps. This factory reset scenario won't impact web app
developers (privileged, web). Are would-be Gaia developers the target
you're concerned about?
Post by Kartikaya Gupta
Seeing as the heart of the problem is distinguishing the device owner
and Mr. Malicious, perhaps we could ask for some piece of information
the device owner is much more likely to have. The SIM PIN might be such
a thing, or maybe some other unique identifier that comes with the phone
but isn't physically present or accessible on the handset itself.
Since the SIM can be removed and replaced by the attacker's SIM, it
doesn't look like the right candidate. That's why we consider the device
PIN code instead.
The issue we're hitting is always the same: how to make sure it's the
actual owner of the device who is the first to initialize the
authentication service (setting a PIN code, synchronizing to a backup
service, etc.) while protecting the data. Hence the factory reset solution...


Stéphanie
Dale Harvey
2014-09-09 14:25:28 UTC
I am likely ignorant about the reasoning behind some of the security
decisions made for our device; however, I have been frustrated by them. In
previous lives doing Android and iOS development I have found FxOS fairly
similarly frustrating to iOS in different areas, and Android
comparatively a huge amount easier.

It's possible I am entirely off base and there are reasons we can't make
life this easy, but at least having them explained would maybe help ease
the frustration.

With Android development, I get my device (user build), enable the
developer menu (it used to be tapping a button 7 times; now it's shaking
it, I think), and turn on debugging. When my phone connects to my computer
I accept a prompt that says that computer is allowed to access my device;
at that point I have unfettered access, and adb continues to work no
matter if my screen is switched off or restarted or is in various other
states in which we lose adb access.

If my phone has a PIN then the only way to get adb access is to get access
to my device unlocked; at that point all bets are off, which I think is
entirely reasonable.

Even in development builds, but particularly with user builds, adb access
to the device is extremely flaky and I routinely have to go into fastboot
mode to reflash my entire device; for example, if I push a syntax error in
the system app in a user build then adb will never be re-enabled.

Having to do a factory reset, or, as was mentioned in the Google doc, sign
up for some Firefox online account, seems straight-up developer-hostile to me.
Stéphanie Ouillon
2014-09-09 15:28:13 UTC
Post by Dale Harvey
Even in development builds but particularly with user builds adb access
to the device is extremely flakey and I routinely have to go into
fastboot mode to reflash my entire device, like if I push a syntax error
in the system app in a user build then adb will never be reenabled.
Sounds like an issue unrelated to security decisions, but I see your
point. To understand the UX flow a bit more: if you flash your phone,
then either

1) you have a standard build, and you have to go through the FTU. You
can then enable the 'os-developer' mode without having to add too many
steps (well, this is a matter of discussion of course). And since I'm
not sure if it was stated clearly enough in the first email: you would
need to go through this procedure only once, should you remember the PIN
code you set the first time.

2) or you flash your custom builds, in which case maybe it would be
possible to set a pref to enable os-developer mode by default if you have
already configured it to skip the FTU, activate adb, etc.?
Post by Dale Harvey
Having to do a factory reset, or as was mentioned in the google doc sign
up to some firefox online account seems straight up developer hostile to me.
The thing is, if the use case is a developer who is often flashing his
device to do testing or development, then the security risk related to
(security-sensitive) data loss is pretty low, imho.
But if you're an attacker, or an ordinary user deciding to switch your
everyday phone into os-developer mode (which currently is done
through the same routine of stopping/starting b2g via adb and setting a
pref, assuming you have a rooted device), then it's another story.


As a side note, for people debugging with the help of the App
Manager/WebIDE, the devtools team is working on implementing remote
debugging over Wi-Fi (without using adb at all) [1] .

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=962308
Paul Theriault
2014-09-10 03:14:18 UTC
Post by Dale Harvey
I am likely ignorant about the reasoning behind some of the security decisions made for our device, however I have been fustrated by them, in previous lives doing android and ios development I have found fxos fairly similiarly fustrating as ios between different areas, and android comparatively a huge amount easier.
Its possible I am entirely off base and there are reasons we cant make life this easy, but at least having them explained would maybe help ease the fustration.
With android development, I get my device (user build), enable the developer menu (it used to be tap a button 7 times, now its shake it I think), turn on debugging, when my phone connects to my computer I accept a prompt that says that computer is allowed to access my device, at that point I have unfetered access and adb continues to work no matter if my screen is switched off or restarted or is in various other states in which we lose adb access.
This is useful feedback, but I think adb access is separate from the discussion here. The only reason to prevent new ADB connections from a security perspective is when the device is locked with a passcode, so anything else we should consider a bug and fix. Which I think would alleviate most of your pain points above.

But that is a little separate from what I was proposing.

What I am talking about here is access to debug Firefox OS itself, without requiring the phone to be rooted, but whilst also protecting user data.

To debug the main process or certified apps, you currently need root access, which means you have to have access to, and flash, a build with root enabled. I'm trying to improve this situation by providing a way to get this access, while also protecting the user data, which today is only protected by virtue of root access not being available.
Post by Dale Harvey
If my phone has a pin then the only way to get adb access is to get access to my device unlocked, at that point all bets are off which I think is entirely reasonable.
Even in development builds but particularly with user builds adb access to the device is extremely flakey and I routinely have to go into fastboot mode to reflash my entire device, like if I push a syntax error in the system app in a user build then adb will never be reenabled.
Having to do a factory reset, or as was mentioned in the google doc sign up to some firefox online account seems straight up developer hostile to me.
I don't see why? Enabling this level of debugging is equivalent to rooting. I'm struggling to see how providing a way for developers to have effectively root access to production phones is MORE hostile than the current situation, where it's not possible at all.

AFAIK Android does exactly the same thing btw, with "fastboot oem unlock" when rooting.
Fabrice Desré
2014-09-09 16:00:56 UTC
Post by Stéphanie Ouillon
Definitely not, since what we want to achieve ultimately is protecting
the user's data. But I don't know the details of the possible solutions
for the backup and restore mechanism, so I'll let better informed people
answer this.
Right now we don't have any good solution. We discussed possible use
cases/APIs on this mailing list already, and maybe we'll have time to
implement something for the next release!

Fabrice
--
Fabrice Desré
b2g team
Mozilla Corporation
Kartikaya Gupta
2014-09-09 17:34:29 UTC
Post by Fabrice Desré
Post by Stéphanie Ouillon
Definitely not, since what we want to achieve ultimately is protecting
the user's data. But I don't know the details of the possible solutions
for the backup and restore mechanism, so I'll let better informed people
answer this.
Right now we don't have any good solution. We discussed possible use
case/api on this mailing list already, and maybe we'll have time to
implement for the next release!
Ok, it just feels premature to me to make a decision on requiring a
factory reset without having decided how the user will do a
backup/restore. Particularly if the backup/restore allows malicious
people to grab the user data anyway.

kats
Kartikaya Gupta
2014-09-09 17:32:49 UTC
Post by Stéphanie Ouillon
Post by Kartikaya Gupta
I just worry that forcing a factory reset in this scenario is going to
place a big barrier to allowing our users to organically grow from
"users" to "webmaker". That is, they will find it much harder to learn
and hack their phones in ways that we should be should be actively
encouraging.
This 'os-developer' mode is meant for people who want to write and debug
certified apps. This factory reset scenario won't impact web app
developers (privileged, web). Are would-be Gaia developers the target
you're concerned about?
I'm concerned about all users. It may seem like the number of users who
would want to debug certified apps is small, but consider that many
developers start because of the "scratch your own itch" paradigm - that
is, many developers start digging, debugging and hacking because there's
some deficiency in the app that bothers them and that they actually want
to fix. Given that most of their interaction will be with the core
built-in apps, which are mostly (entirely?) certified apps, it makes
sense that most of the itches they will want to scratch will be in this
category.
Post by Stéphanie Ouillon
Since the SIM can be removed and replaced by the attacker's SIM, it
doesn't look like a right candidate. That's why we consider the device
PIN code instead.
Good point, the SIM is probably not the right thing then. I don't have
any better ideas :(

kats
Andrew Sutherland
2014-09-10 05:30:49 UTC
This seems like a good idea, but I think the approach may not go far
enough. I have some suggestions.

I think there are a few scenarios that interact with the proposed
functionality:
1: Lost, locked device found by a nefarious person with no plans to
return it
2: Device in the possession of a nefarious person who intends a
persistent attack on the owner of the device.

1) In the forever-lost-to-evil case we want the attacker to not be able
to get at the data, so wiping the data as a prerequisite to gaining
root-ish access seems absolutely correct. And as Paul notes, it's quite
conceivable for even a limited-capability attacker to keep the device
out-of-touch via network, etc., making remote wipe insufficient on its
own. Of course, since the data is not encrypted on the flash, we will
lose to more capable attackers at this time.

2) In the root-and-return case we want the true owner of the device to
be able to know that their device has been tampered with. The problem
is that once a device has been rooted, the device can no longer be
trusted to indicate that it's been tampered with at this time.
Obviously if we do enough trusted-computing stuff we could get the boot
process to indicate rooting (a Firefox with a robber's mask!), but I
think we're pretty far from that right now.

Nuking the user's data is an excellent indicator of potential
tampering. But that's not reliably being proposed here. The current
proposal as I read it asks the user "want a lock screen code?" at
first-run but the actual decision is "want a lock screen code and for it
also to enable a super-powerful developer mode that could let someone
persistently pwn your device?"

It seems like it would be better to just ask the user outright whether
they'd like to enable super-dangerous debugging mode and how they'd like
to secure that debugging mode. For example, I would personally prefer
that the code that would let an attacker persistently root my device be
more sophisticated than the 4-digit code I'm potentially typing in front
of everyone all the time and smudging onto the screen with my
fingerprints. Or, since people hate FTU screens, make it an opt-in
under an "advanced..." or "developer..." call-out in the flow that will
catch the eye of developers/tinkerers but not infuriate most users.


There are other possible tamper indication options, and using those
could let the user upgrade to "super developer mode" without nuking all
their data. For example, if the device is bound to a Firefox account,
the Mozilla server could potentially generate a rooting-unlock
authorization for the device if-and-only-if the account has been bound
to the device for some number of days. The user hits the "hey, I wanna
be a super-fancy developer and do dangerous stuff" button. We set some
arbitrary delay on this, N hours:
- send out an email to the associated email address immediately, and
then randomly at some point in the next N/2 hours (the idea being not to
be predictable so if the attacker is able to use the email app to delete
the email they have to be at least somewhat competent rather than just
waiting exactly 2 hours).
- present a persistent notification in the tray "still want to root your
device?" for the duration of the N hours.
At the end of the time period the device gets unlocked and a persistent
note is made on the Firefox Account for the device.
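In very rough pseudo-Python, with every name and number below made up (the
only fixed points are the immediate email, a second email at a random time
in the first N/2 hours, the persistent notification, and the unlock plus
account note after N hours):

import random
from datetime import datetime, timedelta

BINDING_AGE_MIN = timedelta(days=7)  # "bound for some number of days" - value invented
N = timedelta(hours=24)              # the arbitrary delay - value invented

def request_root_unlock(account_bound_since, now=None):
    now = now or datetime.utcnow()
    if now - account_bound_since < BINDING_AGE_MIN:
        raise PermissionError("account hasn't been bound to this device long enough")
    return {
        "first_email_at": now,  # notify the account's email address immediately
        # second, unpredictable email somewhere in the first N/2 hours
        "second_email_at": now + timedelta(seconds=random.uniform(0, N.total_seconds() / 2)),
        "notification_until": now + N,  # "still want to root your device?" stays in the tray
        "unlock_at": now + N,           # device unlocks; a note is recorded on the account
    }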

The general idea is that you have to lose control of your device for an
extended period of time and our web services infrastructure can help
provide notifications via other channels if we have them. (In an ideal
world everyone has both a Firefox OS phone on them and a Firefox OS
tablet at home, right?)

Honestly, the bang/buck effort seems way off for this compared to "opt
in to developer mode at first-run, potentially having to wipe your
device." And until we provide more support for layered
security/encryption, in many cases there isn't much of a point since the
weak 4-digit pass-code is all that's standing between the attacker and
the user's email account(s)/etc.

Also, many interesting permutations of this potentially want the
processor/chipset to have a non-extractable private crypto key that can
be used to prove the device is who it says it is. Various things using
serial numbers/MACs/etc. are too predictable or just accessible to
would-be attackers on the back of the box or inside the battery case. I
think many interesting server-assisted mechanisms depend on a
non-forge-able device id where the initial owner of the device can
reliably bind the device to some other authentication factors. (So it
becomes "*initial* possession is nine tenths of the law" rather than
just "possession".)

Andrew
Paul Theriault
2014-09-10 07:48:12 UTC
Post by Andrew Sutherland
This seems like a good idea, but I think the approach may not go far enough. I have some suggestions.
1: Lost, locked device found by a nefarious person with no plans to return it
2: Device in the possession of a nefarious person who intends a persistent attack on the owner of the device.
1) In the forever-lost-to-evil case we want the attacker to not be able to get at the data, so wiping the data as a prerequisite to gaining root-ish access seems absolutely correct. And as Paul notes, it's quite conceivable for even a limited-capability attacker to keep the device out-of-touch via network, etc., making remote wipe insufficient on its own. Of course, since the data is not encrypted on the flash, we will lose to more capable attackers at this time.
2) In the root-and-return case we want the true owner of the device to be able to know that their device has been tampered with. The problem is that once a device has been rooted, the device can no longer be trusted to indicate that it's been tampered with at this time. Obviously if we do enough trusted-computing stuff we could get the boot process to indicate rooting (a Firefox with a robber's mask!), but I think we're pretty far from that right now.
Nuking the user's data is an excellent indicator of potential tampering. But that's not reliably being proposed here. The current proposal as I read it asks the user "want a lock screen code?" at first-run but the actual decision is "want a lock screen code and for it also to enable a super-powerful developer mode that could let someone persistently pwn your device?"
It seems like it would be better to just ask the user outright whether they'd like to enable super-dangerous debugging mode and how they'd like to secure that debugging mode. For example, I would personally prefer that the code that would let an attacker persistently root my device be more sophisticated than the 4-digit code I'm potentially typing in front of everyone all the time and smudging onto the screen with my fingerprints. Or, since people hate FTU screens, make it an opt-in under an "advanced..." or "developer..." call-out in the flow that will catch the eye of developers/tinkerers but not infuriate most users.
Both good points...
Post by Andrew Sutherland
- send out an email to the associated email address immediately, and then randomly at some point in the next N/2 hours (the idea being not to be predictable so if the attacker is able to use the email app to delete the email they have to be at least somewhat competent rather than just waiting exactly 2 hours).
- present a persistent notification in the tray "still want to root your device?" for the duration of the N hours.
At the end of the time period the device gets unlocked and a persistent note is made on the Firefox Account for the device.
This is something I considered, and we already have Firefox Account signup in FTU (hence my comment in the Google doc about this). But unless you do this on the _actual_ first run (or after a factory reset), you have to wipe the data, right? Otherwise it's just the attacker setting up their own Firefox Account, not the device's actual owner. Or am I missing something?

But I do like the idea of tying this dangerous functionality to a Firefox Account, instead of a PIN (hopefully the Firefox Account is more secure). Not sure if developers will agree with having to have a Firefox Account though?
The general idea is that you have to lose control of your device for an extended period of time and our web services infrastructure can help provide notifications via other channels if we have them. (In an ideal world everyone has both a Firefox OS phone on them and a Firefox OS tablet at home, right?)
Honestly, the bang/buck effort seems way off for this compared to "opt in to developer mode at first-run, potentially having to wipe your device." And until we provide more support for layered security/encryption, in many cases there isn't much of a point since the weak 4-digit pass-code is all that's standing between the attacker and the user's email account(s)/etc.
Also, many interesting permutations of this potentially want the processor/chipset to have a non-extractable private crypto key that can be used to prove the device is who it says it is. Various things using serial numbers/MACs/etc. are too predictable or just accessible to would-be attackers on the back of the box or inside the battery case. I think many interesting server-assisted mechanisms depend on a non-forge-able device id where the initial owner of the device can reliably bind the device to some other authentication factors. (So it becomes "*initial* possession is nine tenths of the law" rather than just "possession”.)
Other options for user authentication I had been thinking about were (a rough sketch of the pairing idea follows the list):

- pairing the phone with the computer it is going to be plugged into - maybe via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
- Ship phones with “developer NFC sticker” - basically an NFC tag which is proof of ownership (only works for NFC devices obviously)
- Pair the phone with a computer via bluetooth during FTU. Access to developer options later requires you to pair to the computer again
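To make the pairing options a bit more concrete, here is a rough sketch of what "pairing" could mean cryptographically. This is purely illustrative: it ignores the actual adb/Wi-Fi/Bluetooth transport, assumes the key exchange happens over whichever channel is used at FTU, and uses Node's crypto module rather than anything that exists in Gecko today.

import { generateKeyPairSync, randomBytes, sign, verify } from "node:crypto";

// --- On the developer's computer (the host paired at FTU) ---
// The host generates a key pair and hands its public key to the phone.
const host = generateKeyPairSync("ed25519");

// --- On the phone ---
// The phone remembers only the public key it saw during FTU.
const trustedHostPublicKey = host.publicKey;

// Later, to enable developer options, the phone issues a random challenge...
const challenge = randomBytes(32);

// ...the host signs it with its private key...
const signature = sign(null, challenge, host.privateKey);

// ...and the phone enables developer options only if the signature verifies.
const isTrustedHost = verify(null, challenge, trustedHostPublicKey, signature);
console.log(isTrustedHost ? "developer options unlocked" : "pairing check failed");

The NFC-sticker variant is the same idea with the tag (or a secret derived from it) standing in for the host key: possession of the physical token replaces the passcode, and the phone itself never has to store a user-chosen secret.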
Paul Theriault
2014-09-10 07:57:12 UTC
Permalink
Post by Paul Theriault
- pairing the phone with the computer it is going to be plugged into - maybe via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
- Ship phones with “developer NFC sticker” - basically an NFC tag which is proof of ownership (only works for NFC devices obviously)
- Pair the phone with a computer via bluetooth during FTU. Access to developer options later requires you to pair to the computer again
Which are all, on second thoughts, just overly complex ways of setting up a password to access developer options at first use...
Andrew Sutherland
2014-09-10 08:36:38 UTC
Permalink
Post by Paul Theriault
Post by Paul Theriault
- pairing the phone with the computer it is going to be plugged into
- maybe via adb (maybe by use of 842747) or wifi (with upcoming wifi
debugging)
- Ship phones with “developer NFC sticker” - basically an NFC tag
which is proof of ownership (only works for NFC devices obviously)
- Pair the phone with a computer via bluetooth during FTU. Access to
developer options later requires you to pair to the computer again
Which are all, on second thoughts, just overly complex ways of setting
up a password to access developer options at first use...
Well, passwords do suck, though. And so does manually needing to turn
os-debugging mode on/off, or dealing with that frustrating 12-hour adb
kill-timer. All of the options you list above sound like great ways to
help me make sure that my phone is only in development mode when I'm
at home at my dev machine, but is safe out in the world.
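One way to read that: rather than a manual toggle or a fixed kill-timer, debugging could stay enabled only while a trusted host keeps checking in. A rough sketch of that policy, with made-up names and nothing corresponding to the actual adb timer implementation:

// Debugging stays on only while a trusted (paired) host has been seen
// recently; once the phone leaves the dev machine it locks itself down.
class DebugPolicy {
  private lastSeen = 0;

  constructor(private graceMs: number) {}

  // Called whenever the paired computer completes a successful handshake.
  trustedHostSeen(now: number): void {
    this.lastSeen = now;
  }

  debuggingAllowed(now: number): boolean {
    return now - this.lastSeen <= this.graceMs;
  }
}

// Example: a 10-minute grace period instead of a 12-hour kill-timer.
const policy = new DebugPolicy(10 * 60 * 1000);
policy.trustedHostSeen(Date.now());
console.log(policy.debuggingAllowed(Date.now()));                  // true
console.log(policy.debuggingAllowed(Date.now() + 60 * 60 * 1000)); // false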

Andrew
Stephanie Ouillon
2014-09-10 09:16:56 UTC
Permalink
Post by Paul Theriault
- pairing the phone with the computer it is going to be plugged into - maybe
via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
- Ship phones with “developer NFC sticker” - basically an NFC tag which is
proof of ownership (only works for NFC devices obviously)
- Pair the phone with a computer via bluetooth during FTU. Access to
developer options later requires you to pair to the computer again
I don't see how 1) solves the case when a user has never paired his/her device with
a computer and the attacker does it. Or do you mean doing that during FTU?
3) solves this issue, but the drawback is that you don't necessarily have a computer
nearby when you first start your phone.

For 1) and 3), assuming they solve the issue of an attacker being able
to pair the stolen phone first to his/her own computer, what happens if you
want to help somebody debug his/her phone, or use a device that you don't
necessarily own (I'm thinking about the context of a debugging session,
or workshop, or hackathon, or class...)?
Maybe an option you could set (while the phone is connected to the
legitimate paired computer), such as "enable pairing with one more device",
would solve that.
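A sketch of what that one-off grant could look like on the phone side (hypothetical names again, and it completely ignores how the guest computer authenticates over the wire; a variant that pairs the extra device permanently would add it to the trusted set instead of consuming a one-time flag):

// A host is identified by e.g. a fingerprint of its public key.
type HostId = string;

class PairingPolicy {
  private trustedHosts = new Set<HostId>();
  private oneTimeGrant = false;

  constructor(ftuHost: HostId) {
    // The computer paired during FTU is trusted permanently.
    this.trustedHosts.add(ftuHost);
  }

  // Set from the settings app while connected to an already-trusted computer.
  allowOneMoreDevice(): void {
    this.oneTimeGrant = true;
  }

  // Called when a computer asks to start a debugging session.
  mayDebug(host: HostId): boolean {
    if (this.trustedHosts.has(host)) return true;
    if (this.oneTimeGrant) {
      this.oneTimeGrant = false; // consume the grant: one extra session only
      return true;
    }
    return false;
  }
}

// Example: the FTU computer is trusted; a workshop laptop gets a one-off pass.
const policy = new PairingPolicy("ftu-dev-machine");
policy.allowOneMoreDevice();
console.log(policy.mayDebug("workshop-laptop")); // true (grant consumed)
console.log(policy.mayDebug("another-laptop"));  // false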
Paul Theriault
2014-09-10 10:39:41 UTC
Permalink
Post by Stephanie Ouillon
Post by Paul Theriault
- pairing the phone with the computer it is going to be plugged into - maybe
via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
- Ship phones with “developer NFC sticker” - basically an NFC tag which is
proof of ownership (only works for NFC devices obviously)
- Pair the phone with a computer via bluetooth during FTU. Access to
developer options later requires you to pair to the computer again
I don't see how 1) solves the case when a user has never paired his/her device with
a computer and the attacker does it. Or do you mean doing that during FTU?
Yeah, I meant doing all of those during FTU, as an alternative to a PIN code/password.
Post by Stephanie Ouillon
3) solves this issue, but the drawback is that you don't necessarily have a computer
nearby when you first start your phone.
For 1) and 3), assuming they solve the issue of an attacker being able
to pair the stolen phone first to his/her own computer, what happens if you
want to help somebody debug his/her phone, or use a device that you don't
necessarily own (I'm thinking about the context of a debugging session,
or workshop, or hackathon, or class...)?
Maybe an option you could set (while the phone is connected to the
legitimate paired computer), such as "enable pairing with one more device",
would solve that.
Kartikaya Gupta
2014-09-10 14:30:06 UTC
Permalink
Post by Paul Theriault
- pairing the phone with the computer it is going to be plugged into - maybe via adb (maybe by use of 842747) or wifi (with upcoming wifi debugging)
- Ship phones with “developer NFC sticker” - basically an NFC tag which is proof of ownership (only works for NFC devices obviously)
- Pair the phone with a computer via bluetooth during FTU. Access to developer options later requires you to pair to the computer again
Of these options I dislike 1 and 3 because (a) you may not have a
computer handy at the time and (b) you may have a different computer
later when you actually want to enable the os-developer mode. I like
option (2) more. In the case of non-NFC devices (or even for NFC
devices) you could just ship phones with a separate unique PIN code to
activate developer mode. If users lose this code, they can recover it by
calling their carrier and authenticating themselves via the regular
carrier authentication channel. Users who buy the phone without a
carrier plan are likely to be developers or savvy enough to realize they
should hang on to the code.
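If something like that were shipped, the phone itself would presumably store only a salted hash of the activation code, with the code itself living on the packaging and in the carrier's records. A minimal sketch with made-up provisioning values (the code format, salt size and use of scrypt are all just illustrative assumptions):

import { randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

// --- At manufacture time ---
// The factory generates a per-device activation code, prints it on the
// packaging / registers it with the carrier, and flashes only salt + hash.
const activationCode = "483-921-657"; // hypothetical example
const salt = randomBytes(16);
const storedHash = scryptSync(activationCode, salt, 32);

// --- On the phone, when the user tries to enable developer mode ---
function checkActivationCode(entered: string): boolean {
  const candidate = scryptSync(entered, salt, 32);
  return timingSafeEqual(candidate, storedHash); // constant-time compare
}

console.log(checkActivationCode("000-000-000")); // false
console.log(checkActivationCode("483-921-657")); // true -> enable dev mode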

kats