

Former account: @Redjard@lemmy.dbzer0.com
Keyoxide: aspe:keyoxide.org:KI5WYVI3WGWSIGMOKOOOGF4JAE (think PGP key)


The monitor sends you a list of accepted input formats. You can sanity check among the list for any outliers, without online information and without hardcoding limits.


I’d expect any current DisplayPort port to handle very high refresh rates when the resolution is reduced correspondingly. The limit, to my knowledge, is bitrate.
I’d also expect connector support to sit in the GPU driver.
A basic sanity check might be the answer, though. Still, why not improve it instead of just raising the number? You could check whether the rate is an outlier, or whether many of the offered profiles climb up to that rate, for example.
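The outlier idea could be a simple check over the mode list the monitor advertises instead of a hardcoded cap. A minimal sketch, assuming a made-up mode list and threshold factor:

```python
# Sketch: flag a refresh rate as suspicious if it dwarfs every other
# rate the monitor advertises, rather than rejecting anything above a
# fixed cap. The mode list and the factor are invented example values.

def suspicious_rates(rates_hz, factor=4.0):
    """Return rates exceeding the highest *other* advertised rate
    by more than `factor` times."""
    flagged = []
    for r in rates_hz:
        others = [x for x in rates_hz if x != r]
        if others and r > factor * max(others):
            flagged.append(r)
    return flagged

modes = [59.94, 60.0, 120.0, 144.0, 10000.0]  # last entry: garbage EDID data
print(suspicious_rates(modes))  # -> [10000.0]
```

A rate that fits the slope of the other profiles passes; a lone absurd value gets flagged without any hardcoded limit.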


Not sure how far you wanna go. I know my way around MediaWiki from the sysadmin side (installing, updating, adding extensions and themes, configuring weird features like LDAP auth, …), the admin side (CSS, users and groups, templates and Lua scripts), and some moderation (editing etiquette on Wikipedia and a few other wikis, typical style guides, organization of pages and overview pages).
I’m quite busy lately, but you could ask me some questions via DM, for example, and I’d be willing to do some small things.


If you measure the response curves of individual cones and rods, you won’t see any of the parameters go below the ms range, probably not even below 10 ms. However, the retina does register bright short pulses, just smeared into longer averaged signals. All the very-high-Hz vision cases see information about the same “object” spread over many cells in the retina: a trail showing up as many distinct images vs. one long smear.
If you couldn’t move your eyes, the limit would be lower; but because you can, the rendering cannot anticipate those effects and emulate them. Motion blur is what happens when you always “anticipate” the eye to remain static. If you could measure eye movement extremely well and react within well under a ms, you might be able to match motion blur to the eye movement of a single person. Add a second observer and it already breaks down. Not that our sensors are anywhere remotely near making this possible.
Edit: I suppose this means that if you integrated a display into contact lenses and got the latency right, you would max out at a lower Hz.


Shouldn’t be enums, as refresh rates can be floating-point, and in practice there is also a lot of weirdness out there, like 59.94 Hz.
The timing really needs to be matched to the monitor; you don’t want a 60 Hz monitor using the resources of a 1000 Hz monitor at any point. It should also be handled by the GPU and GPU driver more than by the OS.
I don’t think it’s that easy, and I struggle to think of a legitimate reason. To me it seems more like an arbitrary bounds check on monitor info received via HDMI/DisplayPort. Bad coding for sure, but also potentially a point where people are pushed to newer, more problematic versions of Windows because the older ones “don’t support new hardware”.
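That 59.94 Hz is really the NTSC rational 60000/1001, which is one reason exact timing fits neither an enum nor a plain float. A minimal sketch of keeping it exact (the function name is invented):

```python
from fractions import Fraction

# Sketch: represent a refresh rate as an exact rational, the way video
# timing standards do ("59.94 Hz" is really 60000/1001), so frame
# periods don't accumulate float rounding error over millions of frames.

def frame_period_ns(rate: Fraction) -> Fraction:
    """Exact frame duration in nanoseconds for a rational refresh rate."""
    return Fraction(1_000_000_000) / rate

ntsc = Fraction(60000, 1001)
print(float(ntsc))                   # 59.94005994005994
print(float(frame_period_ns(ntsc)))  # ~16683333.3 ns per frame
```

An enum can’t hold this, and a float of 59.94 quietly drifts against hardware that actually runs 60000/1001.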


Why was this ever a hardcoded limitation?


It really isn’t. There’s a whole lengthy explanation of it here, but tl;dw: motion breaks it. Lower refresh rates leave single images instead of smooth trails, and if you track motion, slower refresh rates make stuff blurry while in motion.
I don’t think the video mentions it, but you could flicker the backlight to make tracked motion smooth; then, however, eye movements will land many individual images on your retina instead of motion blur.
If you wanna wiggle your mouse at high speeds while maintaining image quality, say for FPS 180 noscopes, then you will easily see improvements into the tens of kHz.
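The tracked-motion blur is easy to estimate: on a sample-and-hold display, an object you track with your eyes smears by roughly its on-screen speed times the frame time. A rough sketch with example numbers (a flick across a 4K screen in a tenth of a second):

```python
# Sketch: perceived smear of a tracked object on a sample-and-hold
# display is roughly speed * frame time, because the image stands
# still for a full frame while the eye keeps moving. Example values.

def smear_px(speed_px_per_s: float, refresh_hz: float) -> float:
    return speed_px_per_s / refresh_hz

speed = 38400  # px/s: 3840 px crossed in 0.1 s
for hz in (60, 240, 1000, 10000):
    print(f"{hz:>5} Hz: {smear_px(speed, hz):7.1f} px of smear")
```

At 60 Hz that flick smears over hundreds of pixels, and the smear only drops to a few pixels somewhere in the tens of kHz, which is the point about fast mouse motion.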


I would say if a country wants to swallow a group of people, they should add their language to the official languages.
There are sufficient examples of nations with multiple official languages, like Belgium and Switzerland each having 3.
Apply this backwards in time too and it solves the issues.
Don’t wanna add a random language to your state? Don’t annex stuff.


Fact is:
The US dollar index, which tracks the greenback against a basket of six major currencies, has risen by just under 3% since the end of last month. The dollar’s surge against the euro has been especially emphatic, at just below 3.5%.
US media claimed:
The dollar’s strength is largely being “driven by demand for so-called safe-haven assets”, the Wall Street Journal said. Reuters was even more emphatic: “Dollar reclaims safe-haven mantle,” read one headline, among many similar ones.
A Brussels think tank instead points out:
The US, a major oil and gas producer, has seen its currency surge as its export prices have risen, driven by the war’s negative impact on the world’s energy supply. Europe, conversely, is a net importer of fossil fuels, pushing the euro lower.
Most damaging for the ‘safe haven’ narrative is, however, the fact that US Treasury yields have actually risen since the start of the war. This is the exact opposite of what a genuine flight to safety would produce, as stronger demand for US debt would push Treasury prices up and yields down.


Hard to read, but I believe that means “no”.


You’re counting “technically at war”? Then almost no country will qualify. Certainly not the US, which multiple countries must still technically have declared war on.


we’re still enjoying each other’s friendship and love.
look at Venn diagram
it’s complicated


I’d like to buy your set for 19.1k please, would you kindly deliver it to North Korea?


This. Aegis does all of the points except offsite backups, and for good reason.
The Aegis app has no network permissions at all, which is obviously a massive boost for security and privacy. And besides, off-device backup from within the app would be a nightmare.
Syncing the backups Aegis makes on change to some other server is better handled by a great dedicated app. Syncthing is the best such program (by far), though for the few files involved here Nextcloud would work just as well.
I assumed he’d estimated it based on how distorted the face appears behind the glasses. I do that all the time.
At this angle it’s hard for me to do that, since I usually use the edges of the face to estimate it: negative glasses pull the contour inwards, positive ones outwards. I can reliably tell when someone is wearing fake glasses (zero strength), for example, and probably estimate strength within 30% of the actual value.
If the image were higher-res, maybe I could estimate this case too. Or this professional optometrist is just a lot better at it than I am.
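The inward/outward shift can even be put in rough numbers: to first order, a spectacle lens scales the image by the “power factor” of spectacle magnification, 1/(1 − d·P), with vertex distance d in meters and lens power P in diopters. A minimal sketch, assuming a typical 15 mm vertex distance:

```python
# Sketch: power factor of spectacle magnification, M = 1 / (1 - d*P).
# Negative lenses (myopia) shrink the image, pulling the face's contour
# inward behind the lens; positive lenses magnify it. The 15 mm vertex
# distance is an assumed typical value, not measured from the photo.

def magnification(power_diopters: float, vertex_m: float = 0.015) -> float:
    return 1.0 / (1.0 - vertex_m * power_diopters)

for p in (-8, -3, 0, +3):
    print(f"{p:+d} D lens: contour scaled by {magnification(p):.2f}")
```

A −8 D lens shrinks the contour by roughly 10%, which is the kind of offset you can eyeball against the face edge; fake glasses sit exactly at 1.00.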
Strong negative glasses (note the face’s contours seen through the lenses sitting well inside the face’s contours around the glasses):

Fake glasses:

Positive glasses:

PS: Searching for generic terms yields 100% fake glasses, so I took a specific person I remember having strong glasses for myopia.


Would be good if you could include a basic explanation as to why. I didn’t know what TASS was, and thus didn’t know what you were talking about. A simple “fake news: this is a Russian propaganda outlet, and OP keeps posting Russian propaganda” would have gone a long way.


OP gets a tag



© TASS, Russian news agency
This is what Russia claims, so may not be accurate and is likely misleading.


Are you unbiased in this matter despite your connections to big cheddar? I prefer moon gouda.