Hacker News | PopAlongKid's comments

>Our parents and elderly relatives didn't grow up with smartphones.

Neither did about half of the millennials, so why don't they need similar help?

I don't think the qualifier is age, rather it is prior computer experience. I am elderly, and only started using a smartphone (as opposed to feature phone) about four years ago, but I have had my hands on computer keyboards for over 50 years, so learning to use all the basic features of a smartphone didn't require any help.

Likewise, many millennials did grow up using computers. If you already understand basics on a PC like bootup, shutdown, login, system settings, installing a program, starting a program, finding a program, copy/paste, upload/download, the smartphone should not present much of a challenge. Otherwise, learning a smartphone is mostly just learning how to use a computer.


"Most importantly to election integrity advocates, [Registrar of Voters] Dupuis said he will release the records in a JSON file, which is a text-based format used to easily analyze large data sets. A Secretary of State’s memo had previously directed the state’s registrars to only release the records in a file type [PDF] that would have made election audits difficult, if not impossible, according to election integrity advocates like Steve Hill, an author at the national nonprofit FairVote."
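To illustrate the point of the quote (the actual schema the registrar releases isn't shown, so the field names here are hypothetical), a JSON export can be tallied with a few lines of Python, whereas the same analysis against a PDF would require fragile text extraction first:

```python
import json

# Hypothetical voter-record export; the real schema is not given
# in the article quoted above.
records_json = """
[
  {"voter_id": "A001", "precinct": "12", "ballot_type": "mail"},
  {"voter_id": "A002", "precinct": "12", "ballot_type": "in-person"},
  {"voter_id": "A003", "precinct": "7",  "ballot_type": "mail"}
]
"""

records = json.loads(records_json)

# Tallying ballots by precinct, the kind of bulk check an audit needs,
# is trivial once the data is structured text rather than a PDF.
by_precinct = {}
for r in records:
    by_precinct[r["precinct"]] = by_precinct.get(r["precinct"], 0) + 1

print(by_precinct)  # {'12': 2, '7': 1}
```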


>It's a straightforward calculation that needs accuracy.

If only that were all it was. It's not the calculations, or linking entries on one form to another, that are the problem; it's interpreting the law. Suppose there is an allowed deduction for a certain type of expense, but the law doesn't provide a precise, actionable definition; it relies instead on "facts and circumstances".

Did you know, for example, that in U.S. income tax law there is no universal definition of a "trade or business", yet that distinction has a huge impact on how certain items of income are treated? So what do you do if you think you have a trade or business, but can't be certain given the information at your disposal? IRL, such disputes often have to be decided in a court of law.


I question whether it is the prosecutor's primary job to reduce crime. Instead I think it is their job to get justice under the law for crimes committed.

I was well along in age before becoming aware that prosecutors have a far greater impact on our justice system than courts and juries, via plea deals, adding or dropping various charges, letting statutes of limitations run, and so on.


>Incredibly, its mice will still charge dead-cockroach-style, flipped on their back.

I don't understand this comment in the article. What is incredible about that? I mean, I could turn my notebook or smartphone over and it would still charge the same, what's the big deal?

>This situation is worsened by the fact that many manufacturers now ship devices without a charging brick.

Which ones (so I can avoid them)? I just purchased a Dell notebook earlier this year and it came with a brick.


The Apple mouse's charging port is on the underside, so you can't use the mouse at all while charging.


Thank you for explaining the issue. I guess there are Apple fans (including the author of the article) who can't conceive of a world where there are people who have never used an Apple mouse.


Most people who use an Apple mouse don't really find it to be "an issue". If by chance you ignore the days or weeks of reminders about charging, plugging in for 1-2 minutes will give you enough charge to last an entire day.

The people who complain about it are almost exclusively people who don't actually use it, but love to complain about it.


When the mouse bricked itself, I was forced to switch to the trackpad. Maybe I could plug the mouse in, if I happened to bring my only cable from home (or maybe from the office) that day. But there was no way of knowing when the mouse was usable again short of interrupting myself to test it. Over time, I decided the mouse wasn't worth the mental overhead and switched full-time to the trackpad, which always works.

Plugging in a laptop is a nuisance, but at least I can do that every time I sit down to start working and it will manage itself all day.


I have a rechargeable, wireless mouse. It's not fruit-branded.

The thing about my mouse is that when the battery gets low, I can just connect it to my laptop with the charging cable and continue working. It's much more convenient than quitting for the day, and cheaper than buying a second mouse.


Charging the Magic Mouse for two minutes will give you weeks of battery life.


The point is that a peripheral device should not be unusable while it’s charging, whether that charge takes two seconds, two minutes, or two hours.

I can’t think of any other recent electronics device that cannot be used while charging.


I can't use my watch while I charge it. Nor my earbuds. Nor my portable Bluetooth speaker.

None of those can give me any meaningful amount of use time (and definitely not enough for a full work day) from a 2 minute charge.

My mouse can. The port location is not a problem. The constant whining about it from people who largely don't use it is more annoying than any disruption a charge cycle causes.


It's basic logic why watches and earbuds generally can't charge while in use, so those are bad examples.

If my Logitech G502 Lightspeed mouse couldn't charge while in use, I wouldn't even consider buying it.

Because when I pay for premium service, I expect premium delivery.

Anything else is fanboyism, handwaving, and/or Stockholm syndrome.

Especially because the previous generation of Apple's mice could be charged while in use.

I don't know why you're so set on defending a downgrade with zero tradeoffs, only to end up with an inferior product.


> If my Logitech G502 mouse couldn't charge while in use, I wouldn't even consider buying it.

Similarly, if a mouse doesn't support touch gestures I wouldn't consider buying it.

I guess we both make compromises: you never have gesture support, I maybe can't use the mouse for two minutes while I go make a coffee if I forget to charge it overnight once a month.

> Especially because the previous generation of Apple's mice could be charged while in use.

The first version of the Magic Mouse required you to change AA batteries, and its predecessor was a wired USB mouse. Not really sure how you think either of those is charging while you use it?


Gestures suck for me.

I prefer many physical buttons for reliable macros :)

But when it comes to charging, there's no reason to just happily accept and defend an inferior solution for no tradeoff whatsoever.


Here's a hint: the thing you clearly think is "inferior"... I don't even think about at all.

It's literally a non issue that gets paraded out by people who don't use it at every opportunity.


It is inferior :) Sorry

When my mouse starts blinking, I just plug in the cable and continue whatever I'm doing, and it's a solved problem. No second-guessing whether I can take a break now in the middle of a meeting, no unnecessary cognitive load, no need to remember to charge it when "I have the time". It's... magical.

It just works.


Personally I don't find it to be a particularly huge "cognitive load" to see a "mouse battery is low, charge soon" notification and then just charge it at the end of the day.

But for those that do, it's great that we have choices.


Yes, but the other point is that, in a practical sense, it is not really a problem. Just a silly design decision.

I expect, if they ever get around to redesigning that mouse, they'll move the port, if only to avoid derision. The recent change of port type evidently wasn't reason enough to relocate it. That front edge is quite thin, so there may not be enough room in the current case.


>The point is that a peripheral device should not be unusable while it’s charging

why?


An unnecessary limitation.

If it were a cheap tier-3 Chinese mouse, that would still be debatable, to say nothing of overpriced hardware.


My Google Pixel 7 came with a cable but without a charger. I think that's the standard now, if not the law in some places.


I should have been clear that I was referring to notebook computers coming without a brick.


>The most important aspects of all this boil down to people being able to do stuff other people would prefer they not do.

Can you elaborate please? I don't get your point.


Yeah, I was moving fast on mobile and really could have done more.

People able to do stuff -->(other people would prefer they not do.)

Hacking of all kinds -->(software, hardware, processes, nature, the OSS mindset brings with it some perception of what could be done as well as might need to be done.)

Reverse Engineer Software -->(OSS tools often have options and workflows unavailable or that are very expensive.)

Archive -->(Some arcade games running as live distributions comes to mind)

Run software on device or in environment not authorized -->(piracy, run in virtual machine, on OS not intended by developer.)

Build software -->(that may be sanctioned, illegal, or otherwise controlled.)

Repurpose hardware -->(turn router into media server, restore lost features, make hardware do extras...)

Repair -->(using the software freedoms often means repair is possible even when it is not intended to be)

Write illegal software -->(It remains possible to drop code on the net anonymously.)

Encryption-->(either novel methods or those deemed a hazard.)

That's a pretty solid list of things people may be inclined to do that other people would prefer they not do.


>The year was 1986, pre-spreadsheets.

There were millions of spreadsheet users by 1986, as VisiCalc was released in 1979[0] and similar programs like SuperCalc[1] were also in use. Both were ported to the IBM PC and saw significant use in the corporate world before and through 1986.

[0]https://en.wikipedia.org/wiki/VisiCalc

[1]https://en.wikipedia.org/wiki/SuperCalc


>Say the tax brackets are rewritten for 2025, starting “January 1”

U.S. income taxes are calculated on an annual basis, not hourly, so that is not an issue. (Wages are taxed according to when they are paid, which is a specific point in time, not according to when they were earned). A better example is trying to figure which calendar (tax) year an item of income or deduction belongs to. On tax professional forums, there are occasional discussions about what happens if I make a business payment online just before New Year's Day begins, but the recipient doesn't "constructively receive" it until after (or similar scenarios involving time zone differences). Do I get a deduction for the old year, but they have income in the new year? (The best answer, of course, is to not wait until a few hours before a deadline to conduct business transactions of this type, but not every type of business has that choice available).


> If it's all just an automated and finite offset, there's no reason for daylight savings policies to hew to 60 minutes adjustments.

For as long as clock sync for electronic devices has been common, I have suggested to anyone who would listen that we should adjust forward 10 minutes on the first Sunday of each month for six months, and then back 10 minutes on the first Sunday of each of the other six months. A ten-minute change once a month is not only easier to adjust to (almost unnoticeable), but if you miss it, it's not as big a deal as being off by an hour would be.
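A minimal sketch of that schedule (which six months shift forward is my assumption; the comment doesn't specify) shows the cumulative offset climbing to the familiar 60 minutes and then unwinding back to zero:

```python
# Proposed scheme: +10 minutes on the first Sunday of each of six months,
# then -10 minutes in each of the other six.  The choice of January-June
# as the "forward" months is illustrative only.
FORWARD_MONTHS = {1, 2, 3, 4, 5, 6}

def offset_after(month):
    """Cumulative offset in minutes from standard time after the
    adjustment made in the given month (year starts at 0)."""
    offset = 0
    for m in range(1, month + 1):
        offset += 10 if m in FORWARD_MONTHS else -10
    return offset

print([offset_after(m) for m in range(1, 13)])
# [10, 20, 30, 40, 50, 60, 50, 40, 30, 20, 10, 0]
```

The peak matches today's one-hour DST offset, but no single step is larger than ten minutes.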


Moving up and down at a linear rate would produce a sawtooth wave, but the length of the day changes as a sine wave. Why not have the clocks sync themselves to sunrise time based on their timezone and latitude? I don't think this would be much different, in practice, than changing times at a linear rate.
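A rough sketch of what "sync to sunrise" would mean: model sunrise as a sinusoid around 6:00 (the amplitude and equinox day below are my approximations for a mid-latitude location, not exact astronomy), and shift the clock by the day-to-day change.

```python
import math

# Very rough model: sunrise drifts sinusoidally over the year around a
# 6:00 mean, with roughly a 90-minute amplitude at mid-latitudes.
# Day 80 is near the spring equinox.
AMPLITUDE_MIN = 90

def sunrise_offset_minutes(day_of_year):
    """Minutes before (-) or after (+) 6:00 that the sun rises,
    under this simplified sine model."""
    return -AMPLITUDE_MIN * math.sin(2 * math.pi * (day_of_year - 80) / 365)

# A clock synced to sunrise would shift by the daily difference, which
# is largest near the equinoxes and nearly zero at the solstices:
d1 = sunrise_offset_minutes(171)
d2 = sunrise_offset_minutes(172)
print(round(d1, 1), round(d2 - d1, 3))
```

The per-day shift under such a model is always a minute or two at most, which is the sense in which it differs little from a linear scheme in practice.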


This would lead to a monumental amount of confusion.

The primary time keeping device in my house is the clock on my stove; I wear old school watches a lot; and most of my cars are old and have old clocks in them. I can't be the only one, so multiply this by at least a few million other people in America alone.

Sure, you can tell everyone they need to ditch dumb clocks and replace them with internet-enabled smart clocks. But I think that's a far more onerous undertaking than just dealing with the fact that solar time and clock time are mismatched.


More than one car per individual already seems like such a waste, but multiplied to the million-person level, is that a real thing in the USA, or are you just extrapolating from your own social bubble? I must admit I thought this was a troll leveraging gross exaggeration of American stereotypes, but I honestly can't decide whether you're serious about these statements.


I have two cars (just for me, not family). Paid about $6k in total for them. You can have multiple cars without being in any particular social class.


Adding to my sibling comment, time is also mostly used as a coordination system. Being offset by a few minutes would make aligning meetings with your remote coworker an even bigger nightmare than it is now.


Why don't we take it to the extreme then? Just define the time the Sun rises as 6am, always, and make seconds longer or shorter to adjust.


The Wikipedia article does a poor job of explaining that it is not a "step-up basis", it is a "basis adjustment to fair market value", which may be up or down. There is a paragraph further down about "stepped-down basis", but it still doesn't make the point clear.
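The distinction can be shown with trivial arithmetic (illustrative only, not tax advice): the inherited asset's basis is replaced by fair market value at death, whichever direction that moves it.

```python
# Basis adjustment to fair market value at death: the heir's basis
# is simply the FMV, whether that is above or below the original basis.
def adjusted_basis(original_basis, fmv_at_death):
    """Heir's basis after the fair-market-value adjustment."""
    return fmv_at_death

# "Step-up": bought at 100, worth 250 at death -> heir's basis is 250.
print(adjusted_basis(100, 250))  # 250
# "Step-down": bought at 100, worth 60 at death -> heir's basis is 60.
print(adjusted_basis(100, 60))   # 60
```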

