I don't know what you are referring to; maybe it's AliExpress-level and they skimped on wire thickness. Most of the cost is the pure copper cable here, which really should cost $100+ for a 16 ft cable if you don't want it melting when it's hot outside.
There are dozens and dozens of J1772 chargers on Amazon at varying price points, and you can easily find 30A-capable chargers that offer a 20-30 foot cable for $150.
I don't know if they're skimping on the wire thickness, but frankly I doubt it. Tesla can offer their UMC for $250, and that's a branded product from a car manufacturer that's also capable of handling 32A charging all day.
Aluminum has some interesting properties, and if you don't know how to mitigate them, they can be trouble. I know they can be mitigated, but I also know I'm not an expert in all of them, so I won't say too much. So yes, but only if you're careful in ways most people wouldn't think to be careful.
I am laughing because the AVR-8 is a common microcontroller choice for a gas pump: it can handle the buttons and pumps as well as the magnetic card reader, a serial link for credit card transactions, and driving a simple LCD controller.
We were building plenty of complicated yet great user-experience frontends before React was even born, using server-side template engines.
React makes developing these things at scale easier, and makes a website behave more like an application (clicks just trigger JavaScript instead of the classic POST/GET cycle that fetches new HTML). But it is completely unnecessary for building any level of GUI.
Good performance and putting in extra effort should be rewarded over just doing the job at minimum effort.
That's a fundamental problem with teacher pay: there are no bonuses, and there are hardly any incentives or room for raises. It's all just tenure-based. Once you're in, you're in.
Do standardized test score outcomes measure this behavior, or do they favor teachers who teach test-taking and have less challenging demographics in their classrooms?
Performance-based pay in professions whose performance is difficult to measure directly leads to bizarre outcomes. I'm not disagreeing in general, but I've seen this again and again in my career, and I spent a long time on Wall Street, where bonuses -really- matter. That extreme brought out the extremes in how incentive pay distorts behavior in unexpected and undesirable ways, which gets worse the further you get from a directly measurable outcome like PnL.
As a parallel example, hospitals that specialize in extremely difficult diseases with high fatality rates generally have abysmal patient outcome metrics compared to hospitals that punt anything complex to a specialty hospital. This plays out in policy spaces punishing the specialty hospitals, despite the fact that they are well known to be top of the industry in terms of performance and "extra effort." Nominally, though, by all the metrics they should be shut down.
This absolutely plays out in education. Not every classroom, school, or district faces equivalent challenges. A teacher with a very challenged class who is a high performer and puts in extra effort will be punished simply because their baseline was much lower than that of a teacher who punches the clock in a class of affluent students with private tutors.
Presuming my understanding of persistent sessions lines up with yours, set `terminal.integrated.enablePersistentSessions`. You may also want to raise `terminal.integrated.persistentSessionScrollback` above the default (100, if I remember right).
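For reference, a minimal `settings.json` fragment; the setting names are as I recall them from current VS Code, so double-check them in your version's settings UI:

```jsonc
{
  // Keep terminal processes alive and restore them on window reload.
  "terminal.integrated.enablePersistentSessions": true,
  // Lines of scrollback restored per session; raise it if the default feels short.
  "terminal.integrated.persistentSessionScrollback": 10000
}
```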
It does for me when using a remote instance, like a devcontainer or over SSH. Maybe that's just because the server side keeps running, and when you reconnect, the UI reloads from there. Locally, nothing would remain to reload after you restart VSCode.
Probably some dark pattern where they hide the actual price for just YouTube Music and show you the subscription for a bundle of services you don't want...
An opt-out doesn't solve the problem, because one person in the chain of custody of an Excel file who hasn't opted out would be enough to destroy the data.
Only if the option was implemented as a user preference which isn’t stored in the file. The right way to do this would be to add the flag as a file header, and give people a UI control to set it by default. That way you’d always see spreadsheets as the person who gave it to you did, and you would create new ones with your preferred behavior.
Channeling my inner QA person, this breaks in many subtle and not so subtle ways:
- People love to export as CSV. And I kind of understand them: CSV looks simple enough, what could go wrong?
- Other spreadsheet software needs to recognize the same field and act the same way: Google Sheets, LibreOffice, KingSoft's offering, etc. Not counting old Office versions that aren't updated anymore because the owner doesn't want to move to the cloud.
- The flag needs to survive moving sheets around, copying to another workbook, etc.
That could still be better than nothing. But it could still lead to catastrophic losses as people get used to it working most of the time; there's no perfect solution.
It might be easier to think about it as the difference between perfection and practical improvement. Right now 100% of Excel users are at risk of silent data corruption. The proposed change would let people reliably prevent this category of corruption for all of the documents they create, and over time those would become the majority in their field.
This would also offer the path for changing the default over time to safe by default. There will be more spreadsheets created in the future than exist now.
> Right now 100% of Excel users are at risk of silent data corruption.
They already added an option in the settings last year to deal with this specific issue. Their solution isn't perfect either; they made the tradeoff of optimizing for up-to-date Excel users.
Still the same problems. Unless you exclude people with older Excel versions, they can still ruin the file. And if the first person in the chain didn't make the change, who would even notice? So we're back to requiring extreme vigilance without any assurance.
Also keep in mind that the first person doesn't necessarily use excel at all. The file might start and end as a text file or TSV.
I don't understand what's wrong with the existing solution of:
Right Click > Format Cells > Text.
Set the data type of the column and Excel will not fiddle with it. Leave it implicit and Excel will try to guess. This change is saved in the file. The article only briefly touches on this: converting to CSV, losing all the Excel formatting, and then opening it in Excel again.
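The same "treat this column as text" idea can be applied programmatically; a minimal sketch with openpyxl (a common third-party Python library, not part of Excel), where `"@"` is Excel's built-in Text number format:

```python
from openpyxl import Workbook

wb = Workbook()
ws = wb.active

# Values that Excel's auto-detection would otherwise mangle:
# a gene name that looks like a date, a zero-padded ID, a range.
for row, value in enumerate(["SEPT2", "00123", "1-2"], start=1):
    cell = ws.cell(row=row, column=1, value=value)
    cell.number_format = "@"  # "@" is the built-in Excel "Text" format

wb.save("ids_as_text.xlsx")
```

Because the format lives in the .xlsx file itself, anyone opening it later sees the cells typed as text; this is exactly what a CSV round-trip throws away.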
As usual, I think it's a lot more nuanced than that.
An option was added last year [0] the way you propose, but even before that, users had the option to mark the columns as text. The same way we deal with any data where auto-detection might be problematic: explicitly set a type, and potentially a format.
But of course that's bothersome; sometimes people forget to do it, sometimes they don't even know how to do it properly, etc. And some of these people will still probably forget to set the option, and there are still edge cases where it can fail (macros are cited).
> but until then users already had the option to mark the columns as text
This only works in Excel files, though, which save a type for columns/cells. With CSV files, Excel just auto-formats them. I guess you could save everything as xls/xlsx, but I shouldn't have to use a vendor's file format because they're threatening to corrupt my data if I don't.
CSV doesn't do what you want, and Excel doesn't really support CSV files. Why insist on a completely wrong combination of tool and file format?
If the Excel developers cared about CSV support, they would have added more settings to the CSV import tool that let you specify everything. Every single database tool that lets you import CSV files works better than Excel here. It simply means Excel wasn't built to import CSV files. Tough luck.
The .xlsx format is well supported by many open source libraries. If you insist on working with Excel, then you should use the appropriate file format, which is .xlsx. I do that all the time and it just works.
That's just CSV being unusable shit. There's always a schema for the types of the fields, but there's no way to communicate that schema, so CSV is inherently ambiguous. If your data includes fancy things like numbers you should avoid CSV.
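To see the ambiguity concretely, a stdlib sketch: Python's csv module hands back nothing but strings, so whether a field is text, a number, a date, or a boolean is entirely the consumer's guess:

```python
import csv
import io

# A CSV snippet with values whose types are ambiguous on paper:
# a zero-padded ID, something date-like, and a boolean-looking string.
raw = "id,code,flag\n00123,1-2,TRUE\n"

rows = list(csv.reader(io.StringIO(raw)))

# The csv module returns plain strings; nothing in the format says whether
# "00123" is text or the number 123, or "1-2" is a range or February 1st.
print(rows[1])  # ['00123', '1-2', 'TRUE']
assert all(isinstance(v, str) for v in rows[1])
```

Excel's sin is that it resolves this guess silently and destructively on open; but the underlying problem is that CSV gives it nothing better to go on.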
It's still a shame that Excel's JSON support is so good, yet no business user trusts it because "you can't double-click the files", and CSV remains the worst best thing you can double-click.
I don't want Excel to take over the file association for .json, but I do sometimes wish Excel would just invent a file extension like .xljson that loads as JSON, so that you could trivially rename a JSON file to .xljson and get working double-click support for your business users, with a file format (with optional schema support, even) that's a more modern and saner de facto standard than CSV.
I think that one dumb trick would improve everyone's productivity a great deal.
No, I'm talking about developing for those users. Right now, if you want to write the simplest, lightest-dependency export code that those users can double-click and open in Excel, you are writing yet another CSV export. It would be nice if instead you could take an existing API that returns JSON, wrap it in a mimetype like application/json+excel, add a Content-Disposition: attachment; filename=thisapi.xljson header, and get Excel double-click support without another flaky CSV file.
I think it's more than that. Big companies like this don't exist to provide solutions in the most efficient manner possible, nor do they (contrary to popular opinion) exist to create the most "shareholder value". Their real reason for existence is to provide employment. (True, the investors really only care about RoI, and would prefer more efficiency with a lower headcount, but the reality is that you need to hire people to actually build things and create solutions to sell, and for those people, including the top executives, the company's mission is not their priority, only their own personal career and bank account.)
So with a company full of people whose #1 concern is their paycheck and their resume, they'll make choices to optimize those, which means turning a trivial 20-minute shell script into a 3-month project requiring a whole team of devs.
They are "on both lists". The DMA is about platform gatekeeping, and the EU did designate Microsoft a gatekeeper with Windows.
Microsoft's DMA changes include allowing Edge to be uninstalled, interoperability in Search and Widgets, asking users for consent before syncing content through the connected Microsoft account, and more strictly respecting browser defaults.
The memory is inherent to the GPU architecture. You can't just add VRAM and expect no other bottlenecks to pop up. Yes, they can reduce the VRAM to create budget models and save a bit here and there, but adding VRAM to a top model is a tricky endeavour.