I don't know what happened and I fully admit that, but let's take the cop section out of it, so none of this "Cops lie all the time" business.
1) A person was shot by someone at the scene
2) A different person who was in conflict with the first person had a gun with the same ballistic profile (caliber etc.)
3) There are two separate parties with an interest in the outcome who witnessed the crime
4) The first party with an interest was actively aggravating people at the scene, causing heightened tensions
5) The second party was in direct conflict with the first
I have a hard time believing the first party shot their own in order to press charges on the second party. It doesn't pass the Occam's razor test, TBH
It doesn't have to pass Occam's razor. It has to pass (or would have had to, had they not killed him) the standard of reasonable doubt.
Edit: Sorry, this puts words in your mouth. What I mean is that I don’t think Occam’s Razor is a sufficient standard of belief in these kinds of situations.
Why aren't people paid a full time salary for the work they're expected to do instead of the time they sit at a desk? It's a question I genuinely struggle with.
If the employee is doing all that's expected of them, then the "contract for owning someone for a period of time" (not my words - and relatively creepy) should be fulfilled. What seems to be expected isn't a period of time but a force of work delivered.
I am willing to be wrong in this assumption, however, given the right argument.
> Why aren't people paid a full time salary for the work they're expected to do instead of the time they sit at a desk?
If the work you are expected to do is fixed--your boss can't come to you with an ad hoc assignment that's not in some formal job description--then you're not an employee, you're an independent contractor. (And that means the corporation no longer has to worry about things like your health care and retirement benefits.)
The real value proposition for having an employee vs. an independent contractor is that the work the employee is expected to do is not fixed--your boss can come to you as an employee with an ad hoc assignment that's not in any formal job description and expect you to do it. In other words, the belief the article describes, that paying someone a full time salary as an employee means the company owns their time, is actually mistaken (even though I think it is very common among corporate managers and executives). What you're actually paying them for is being able to give them ad hoc new jobs to do as the company's environment and needs change, and not have to go through the hassle of writing up a new contractor's statement of work and negotiating a new price. The only justification you need to give an employee a new ad hoc assignment is "this is going to help the company".
Exactly - full-time is about paying someone for availability more than hours of focused labor. It just so happens that being available remotely looks way different (and is generally more leisurely) than in-office availability.
Having slack is key to being an effective business. And slack is literally people not being at max utilization 40 hours a week.
Moreover: in the high skilled tech teams we’re mostly talking about, the expectation is that you as a team member are helping to identify and drive the work itself. Reporting issues, looking for opportunities, talking to users and partners, etc. That requires soft skills, relationships, and active engagement. The idea that you’re hired just to do the obvious things which are easier from home (churn out code, not much else) seems strange to me for most modern tech workers.
The real value prop of having a full-time employee is that you own their full-time focus five days a week. It's not ONLY about practicing a trained skillset or providing expertise; it's also about owning part of the business, managing other employees, and deeply understanding the product and how to market and deliver it to the user.
I'm not saying that's good or bad, however it is definitely not clearly stated-- not in schools, universities, or in job descriptions. Society tells you for decades that you're supposed to specialize and then get paid for that skill, yet most companies expect so much more.
That is the kind of "ownership" that the article is describing, but I don't think it's actually true. Even employees sitting in an office aren't focusing full time on the business for every single minute they are in the office.
That's why I phrased it instead in terms of assignments and how specific the definition of the job is, as compared with an independent contractor. You might not be focused on the business every single minute of your nominal working hours--but your boss can come to you in any one of those minutes and basically redefine what your job is. That's what the company "owns" if you are an employee. And that can happen just as easily if you're remote.
> it's also about owning part of the business
But employees, unless they are also stockholders, don't own any part of the business. And even in cases where employees do own stock on paper, their ownership share is so tiny that it arguably does not provide any meaningful motivation.
Yeah I agree with that. It's about owning your "main" focus. Being late one day and having the excuse that you had to do another job would not be ok, yet you can give many other typical excuses regarding family or dealing with life in general.
> But employees, unless they are also stockholders, don't own any part of the business. And even in cases where employees do own stock on paper, their ownership share is so tiny that it arguably does not provide any meaningful motivation.
Right, even though you don't own the business, it seems the expectation is still to behave as though you do. At best that behavior is rewarded; at worst, its absence is punished by withheld promotions and/or a negative review because you don't have the "soft skills".
That model is not all sunshine and roses either. Wages were a concession sought by labor in the Industrial Revolution because piecework was considered to be worse. Gig work revived all those old problems for a new Gilded Age.
All these problems are coming about because software doesn't fit nicely into any of the traditional arrangements. We really fit in more as consultants external to the business, but the software consulting industry as it exists today is corrupted by the perverse incentives of generating billable hours and essentially scamming clients.
I'd like to see the emergence of software partnerships modeled after law firms, where engineers are owners and not employees. The problem is, tech is currently dominated by criminal monopolists who regularly collude to drive wages down in violation of numerous laws. Musk and his PayPal mafia buddies are running the same racket right now that Apple and Google were convicted of 10 years ago. They would simply refuse to work with any tech worker co-ops, because they have monopsony power over our labor, and they can just lobby Congress for more H1-Bs.
> Why aren't people paid a full time salary for the work they're expected to do instead of the time they sit at a desk? It's a question I genuinely struggle with.
Turns out just being present is a substantial part of what many (most?) employees are paid for. Participation and quality of contributions are optimizations once that bare minimum is attained...
What do you think school is conditioning everyone to do anyways? Get up early and show up, follow instructions, every. day.
That doesn't sound very humane. When you employ someone FTE (vs contract), you employ the whole person, not a machine in a factory. People have good days, people have bad days. Some tasks get unexpectedly complex. Sometimes you need to switch to other tasks. Paying someone for the time lets you average things out, provides psychological security, and lets you share risks.
> If the employee is doing all that's expected of them,
It's all about those expectations. It seems having a trained skillset is never enough. It's not about practicing what you are an expert in, but rather increasing business value at any cost, even if it means doing a lot more than what you were trained to do. These expectations typically include business skills, management skills, and even marketing skills. This is why going to the office is seen as a must by many "leaders" of companies.
You're not just a programmer who contributes code; you are also a business person who owns part of the company, a manager who has to lead teams of people, and a marketing person who needs to think about what the end user wants and how to deliver it. When there are so many cross-disciplinary expectations (which are rarely stated clearly), it's no surprise they want people in one place in order to have them coordinate through this complex web of roles.
> Why aren't people paid a full time salary for the work they're expected to do instead of the time they sit at a desk? It's a question I genuinely struggle with.
Because it's hard to quantify what's expected of someone.
It's much easier to just count the time they're working, so that's what we're doing.
If that's the case I'd expect performance reviews to be limited to attendance. Since they're not, it would suggest we do indeed have ways to quantify our expectations for a role.
I agree, but sooner or later the time component comes in. If you're being paid piecemeal for work, a lot of dynamics emerge after a while, including reduced per-unit pay and increased expectations of throughput. In other words, what happens when the expected deliverables take 70 hrs of your time a week?
Speaking as a huge WFH proponent: the project I'm managing has benefited from it tremendously. (We had one primary and a few satellite offices before covid. Satellite offices were effectively second-tier citizens. WFH enabled everybody to be on the same page and tier. We are way more effective, and I dread a return to the primary office to the exclusion of satellite ones. Everybody else's mileage may vary.)
Well, if person B being able to handle a given workload in 30 hours is roughly the norm, and someone else consistently takes 60, you probably have a performance problem to deal with. (And, yes, a lot of this stuff is hard to specify and measure.)
We expect to purchase X hours of your time in exchange for Y dollars.
If you can get more done in that time, you’re worth more.
If you can get less done, you’re worth less.
If you try to do the minimum “force of work” required, you’re worth less, and likely to be replaced by someone who will maximize the value produced in the amount of time that has been paid for.
It's a matter of priorities. If we decide these things are important as a nation they don't have to be scarce. We make the best war machines on the planet and spend a lot of money on them. That is more of a priority than the good health of our nation.
We spend 5-8x more on healthcare than defense. You have to ration somehow. If everyone used as much of the healthcare system as they wanted, it would consume our society.
In the US, we ration by price… “the doctor’s expensive and I don’t feel too bad, so I’m not going to go”. In Europe, they ration via lines… you’ll get your free healthcare eventually.
Either way there are problems, but seems like other countries might be onto something
We pay a lot for healthcare, but that money is not spent on healthcare.
It is an important distinction. For example, about 33% of healthcare dollars go to paying “claims processing” people at your insurance company and your doctor's office to haggle with each other and produce duplicate paperwork.
If the low end of your estimate (5x) is the correct multiplier, the money that goes to claims processing would be enough to pay for universal healthcare in pretty much any other first world country.
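A quick back-of-envelope sketch of that arithmetic, using only the figures already quoted in this thread (the low-end 5x healthcare-to-defense multiplier and the ~33% claims-processing share); the comparison to other countries' health budgets is the parent's claim, not something computed here:

  # Back-of-envelope using the thread's own figures, not sourced data
  healthcare_vs_defense = 5.0     # low end of the "5-8x" estimate above
  claims_processing_share = 0.33  # "about 33% of healthcare dollars" per the comment above

  # Claims processing alone, expressed as a multiple of the defense budget
  claims_vs_defense = healthcare_vs_defense * claims_processing_share
  print(f"claims processing ~= {claims_vs_defense:.2f}x the defense budget")
  # -> claims processing ~= 1.65x the defense budget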
Other things, like absurdly high drug prices, also are not healthcare spending. 90+% of drug discovery research money is spent at universities, and not by pharmaceutical companies. Also, those companies pay more for prescription drug advertising than for drug research.
"90+% of drug discovery research money is spent at universities"
Afaik this is simply not true, and even if it were, it would be misleading if you consider that most costs are incurred during clinical trials, which test the effectiveness of the drug.
> It is an important distinction. For example, about 33% of healthcare dollars go to paying “claims processing” people at your insurance company and your doctor's office to haggle with each other and produce duplicate paperwork.
This is not true, but there is a big non patient healthcare services culprit in US healthcare costs, and that is legal liability. In the US, every entity in the healthcare chain is doing so much extra to prevent themselves from being liable, and charging extra in case they are found liable, that it inflates all costs dramatically.
Without tort reform, I doubt we see much improvement.
Tort reform is a red herring here, in that most insurance companies routinely do things they deserve to be sued over, and they like to talk about ways to statutorily limit your right to sue them, like having a system of complaints with a commissioner who has no time to process them. I think healthcare costs are a result of lots of different insane things that snowball, rather than something simple like too many lawsuits.
Here in TX they put caps on awards for medical malpractice. As far as I can tell it hasn't helped costs all that much. Also if my Dr. leaves me paralyzed I can only get 500k damages. Malpractice law is a problem, but I don't think it's the one to focus on.
Your numbers are way off. The Affordable Care Act capped the insurance company's share at 20% (minimum 80% medical loss ratio). In practice most commercial payers are taking less than that. Certainly nowhere near 33%.
I'm not sure what point you're trying to make about drug discovery research. The major expense in bringing a new drug to market is not in the basic research but in the phase-3 clinical trials. Those often cost pharmaceutical companies upwards of $1B now.
How much do pharmaceutical companies make off of that $1b investment though? The scale of the investment only matters if they don't have the capital to cover it.
It’s a common argument that somehow people who live in countries with good health coverage have to wait a very long time for health services. For things that are not urgent, this can be true. And you can certainly hear anecdotes here and there about more serious problems. But most of the time, for the vast majority of people in the vast majority of situations, healthcare is delivered when it’s needed, on time, without dangerous delay. This talking point that somehow elevates the American approach because it is faster is misguided.
A war machine can be built once and maintained for a long time. Medical care is a personal and time-consuming process, with multiple parties and tests that need to be coordinated. All of this is expensive because for every person being given that time of day, there's someone else who needs it. There aren't enough hours in the day or smart people in the country to scale up healthcare.
Contrary to what we would like to believe, there is a scarcity of intelligent people willing to devote their entire lives to the study of medicine across specialties.
If you cannot afford it, US healthcare is terrible. If you have the means to afford it, the US healthcare system is one of the best, if not the best, in the world.
A big part of why we spend more is cultural and environmental. The government incentives encourage unhealthy food to be everywhere and subsidizes things like adding corn syrup products to everything. Policy also encourages driving instead of walking. And we've made it culturally acceptable to be fat. There are big movements trying to convince people their weight is not a problem, not something they have control over, and you can be "healthy at any size". Many doctors have stopped warning patients about their obesity because of social pressure not to fat shame! I started with a doctor like that when facing blood pressure and related issues and have since ditched them. Caring more about sensitivity, being accused of fat shaming, and expediency than my health is malpractice in my book.
I’m not at all convinced they look carefully at health outcome and cost efficiency metrics before deciding where to point their private jets. Instead, I’ll bet (in part) they use high cost as a surrogate for quality. And the immediate availability of any test or consultation they desire.
It depends on which metric you look at. The USA is at or near the top in 5-year survival rates for most types of cancer. We also have unrivalled trauma care. In other metrics we're well behind other developed countries.
Many of our worse outcomes though have nothing to do with the healthcare system. The decrease in life expectancy is being driven by factors like obesity, substance abuse, sedentary lifestyles, vehicle crashes, suicide, and violence.
Survival rates are a pretty bad metric because you can easily change them by changing the amount of screening without making people live longer or healthier. When you compare cancer mortality rates, the US is not doing well.
Nope. Certain types of cancer screening are helpful in making people live longer and healthier. It is much easier to treat cancer when it is caught early.
Absolutely, but 5-year survival rates are still a really bad metric because they make things look better even for incurable cancer (or cancer that was curable but whose treatment was given up on because of price). Also, if a 90 year old gets a slow-growing cancer that doesn't require treatment (because they'll be dead before it's a problem), screening for it will increase survival rates even though you didn't actually treat anything.
The drivers of poor US life expectancy are mostly guns and cars. Cars are the #1 killer of Americans between 5 and 45 (approximately). That doesn’t have much to do with the medical system.
You missed the point. Cancer and heart disease primarily kill people when they're already old. Treatment of those diseases, while important and necessary, doesn't impact average life expectancy much either way. Whereas fentanyl overdoses are now the leading cause of death for adults 18 - 45. That has a huge impact on average life expectancy because those people would have otherwise lived many more years.
Your information is outdated. The rate of accidental death among younger adults due to opioid poisonings has increased significantly in just the last few years.
My primary assertion was that life expectancy does not inform you about the quality of the US medical system. I focused on guns and cars, but adding opioids (and even lifestyle issues like obesity) doesn’t change my argument or the conclusion.
Someone dying a few years earlier than otherwise has a smaller impact on _life expectancy_ than someone dying young. Furthermore, the #1 cause is correlated with lifestyle factors which medical treatment has little control over.
> A war machine can be built once and maintained for a long time.
This might be true in theory but is somewhat irrelevant here given the gargantuan levels of waste in US military spending.
There is also gargantuan waste in healthcare costs, but those are spread out via insurance, not taxes.
The presence of gargantuan waste in both sectors and the different avenues of spending both make any after-the-fact, simplified explanation of why one costs more than the other kind of moot.
The presence of gargantuan waste should also be addressed before implying that the costs of healthcare are all presumed necessary.
> Contrary to what we would like to believe, there is a scarcity of intelligent people willing to devote their entire lives to the study of medicine across specialties
Why? Are money and social status insufficient motivators? Or is there another reason?
There is no country on the planet that doesn't have a scarcity of medical resources. In fact, there is an entire academic field devoted to the appropriate utilization of medical resources: healthcare economics.
It can't be too much of a surprise that a lot of white collar jobs involve a fair amount of BS. I have 3 to 4 meetings a day and only 1 or 2 are really necessary =[
What do you see as your best routes to monetization on the Apple v. Android and how FTP differs on each platform?
Also, what routes of promotion have you been taking to build a customer base?
Sorry to ask so many questions, but as a maker of personal projects which are games, I'm curious about the promotion and monetization of mobile games =)
I haven't yet tried an ad-supported game (with an IAP to remove ads), but I think those perform the best overall. Personally I'm opposed to ads for a number of reasons (privacy, bandwidth usage, ad quality), so I'm not sure I'll ever try one of those.
I was lucky to get my first game, Downwordly, as Game of the Day on iOS. This plus other sustained featuring (Essential Word Games) led to 50k+ downloads. Unfortunately I did a bad job with the IAP value proposition (both explaining it and what it actually is), so the conversion rate is abysmal (<1%). Pine Tar Poker was also Game of the Day a few weeks back and had some sustained featuring (Best New Games). That let me ship a few hundred copies over a few days but once the featuring dried up, the sales did too.
Well Word is currently featured under Best New Games and is getting decent download numbers in the mid 100s and has a great conversion rate of ~15%. I'm hopeful that it can continue to spread through word of mouth and the built-in score sharing.
I have made next to nothing on Android. I've never (that I know of) had any featuring and the Play Store doesn't have as many editorial pushes in my experience. To be honest, the download count and revenue from Android is such that I only release games there because it takes less than an hour or so (thanks Unity) and my brother uses Android.
I'm currently working on an expanded version of Pine Tar Poker for console + Steam so I'm interested to see what I can learn in that space! Let me know if you have any other questions. I hope hearing about my experience so far was helpful.
This! I have resisted the Apple platform due to personal tastes, but if they get the blood sugar monitor working (which they say they're close to), I'll switch my entire platform ecosystem over to Apple for that one specific feature.
LOL, according to this, my family's group text thread is illegal under this law, as DMs can only be between 2 people.
EDIT: serious question; would Slack be professional networking or, how does that work under this bill? It's DMing for business, but it isn't really networking.
> (I) shared between the sender and the recipient;
> (II) only visible to the sender and the recipient; and
> (III) are not posted publicly;
Maybe sender and recipient can be considered a group chat with multiple recipients? Otherwise you wouldn't classify something like Kik or Discord as social media.
Group chats wouldn't be illegal; that's not an online service where you sign up and make a profile, etc. This applies only to sites that would serve you random content from random users... i.e. Social Media.