
I'm a senior frontend engineer doing a lot of code reviews and my team loves my MR (PR) submission standard:

- Require the author to attach screenshots of the feature in a markdown table format showing the before and after comparison. This is enforced using an MR template.

- Require the author to attach screen recording for complex user interactions.

The overall effect is that it saves everyone time.

- The author has to verify that the feature works anyway, so taking screenshots / recordings doesn't take much additional effort.

- The reviewer can focus on code itself instead of checking whether the code works.

- Easy for both parties to spot design and UX issues/regressions visually.




But you still have to test the actual functionality, don't you? What if the author makes a gif of clicking a button, but doesn't record if the browser back button still works?

I'd think that for frontend, you'd have a CI/CD pipeline that deploys the code to a staging server, where I can test it myself.


Code review != Testing.

A lot of people conflate these IMHO. Code review should not really be about whether it "works" or not. That's what tests are for.

Code reviews are about checking for code complexity, good use of abstractions, readability, etc.


this should be the most upvoted comment


You have to review tests and you are back at square one.


What's the point of checking for those if the bar for fully functional isn't met?


None. So don't do the review until CI passes.


And also to spread knowledge of aspects of the system across the team.


I remember a few companies ago we had a system that would deploy every branch to a separate subdomain (same name as the PR) that you could access. It was fantastically useful for just seeing what the deployed code would look like. I think (for UI things at least) this is a very reasonable solution.

Wish I could remember the GitHub service that did this?


There are already Vercel, Netlify, etc. that support this feature.

You can also easily do your own version with CloudFront and S3, or just a custom nginx config.
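A rough sketch of the nginx variant, assuming CI drops each branch's static build into /var/www/previews/<branch> (the domain and paths are placeholders):

    # Catch <branch>.preview.example.com and serve that branch's build.
    server {
        listen 80;
        server_name ~^(?<branch>.+)\.preview\.example\.com$;

        root /var/www/previews/$branch;
        index index.html;

        location / {
            try_files $uri $uri/ /index.html;  # SPA fallback to index.html
        }
    }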


Vercel and Netlify do PR previews these days. Was it in the JS space?

I think render.com is going to introduce this soon too.


The name is Review Apps. It's been on Heroku since forever, and it's also in GitLab today.
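In GitLab it's a dynamic environment in .gitlab-ci.yml; a rough sketch, with the deploy script and domain as placeholders:

    deploy_review:
      stage: deploy
      script:
        - ./deploy.sh "$CI_COMMIT_REF_SLUG"   # placeholder deploy step
      environment:
        name: review/$CI_COMMIT_REF_SLUG      # one environment per branch
        url: https://$CI_COMMIT_REF_SLUG.preview.example.com
      rules:
        - if: $CI_PIPELINE_SOURCE == "merge_request_event"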


Isn't this what automated tests are for? Manually testing every change is a huge burden, requires most of the automation necessary for automated testing (from the infrastructure point of view), and actually discourages people from writing automated tests.


It's interesting that you brought up the issue of the back button. It is indeed an area where bugs frequently occur.

I haven't found a good solution except manually asking in every MR where I sense a potential issue. Maybe it would be a good idea to have it in the MR template as a checklist item.

Another problem with the back button is that the expected behaviour is usually missing from the design or requirements to begin with, requiring a lot of back-and-forth on what's actually expected (especially for SPAs).


You can even automate this with something like Percy that takes screenshots and does visual diffs at selected points in the test suite. You just have to be careful about including random/date-based data in your tests or you’ll get more diffs than expected. Also not quite a substitute for someone calling out “this was a major UX change”.
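Roughly what that looks like in a Cypress test with @percy/cypress (the endpoint and fixture names here are made-up placeholders); freezing the clock and stubbing the data keep date-based and random content from producing noisy diffs:

    // Assumes @percy/cypress is installed and imported in the support file.
    describe('checkout page', () => {
      it('matches the visual baseline', () => {
        cy.clock(new Date('2024-01-15T12:00:00Z').getTime()); // freeze "today"
        cy.intercept('GET', '/api/cart', { fixture: 'cart.json' }); // fixed data
        cy.visit('/checkout');
        cy.percySnapshot('Checkout page'); // uploaded for visual diffing in CI
      });
    });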


Interesting to hear that there are automation solutions around this. In your experience / expertise, how much work can be automated by such tools? 80%? 99%?


This is an excellent idea for a submission standard; I'll have to keep it in the back of my mind.


Word is actually really great for this stuff. You get a portable file and can add notes to the screenshots. There is even a screenshot button built into Word!

For recordings, if you use Windows you can bring up the game bar with Win+G. It has a very easy-to-use screen capture tool meant for gaming, but it's a perfect fit for small software demos. It even captures sound, which is sometimes useful.


I'm working on an "instant replay for developers" desktop app for exactly this case. Create desktop instant-replays to embed in pull requests:

http://replayable.io/


The fact that people think this is an excellent idea... screen recording as proof that the feature works? What is the alternative you are getting? People submitting features that don't work? Just fire them, or teach them your set of standards. If my boss asked me to do this I would laugh and find a new job. I truly feel sorry for you, but please also stop ruining our industry with this nonsense.


Requiring a screen recording isn't about a lack of trust or competence. Rather, it's a way of eliminating information asymmetry between the author and the reviewer:

- The author might have checked that the code works (A) or might not have (B).

- The reviewer might decide to run the code locally (1) or not (2).

Combination A1 results in duplicate effort. Combinations A2 and B1 are perfect. Combination B2 results in potential bugs.

The MR standard simply collapses these possibilities by sharing information between the author and the reviewer automatically. The end result is that both parties know A2 is the process being followed.


All you have to do in a code review is review the code. You do not have to run the code, nor should you, unless it is a very tiny project. It is the developer's responsibility to test the functionality and get it ready for the QA process. If there is a dedicated QA who can test the functionality, then QA is responsible for testing on that front.

The code reviewer is not supposed to be testing that the feature works as intended for the product. The manager of the team, the product owner, the client, or a stakeholder would be the one testing the functionality and giving the go-ahead. It is not the job of the code reviewer to do that.

If you have to require this, it means you have not taught your developers the set of standards and practices to follow. If you have to get a screen recording of a feature working (probably on local or dev, so it's about as useful as a unit test), then you, or whoever is responsible, have failed at creating a proper software development process.


You are 100% right and I agree with you completely.

I have failed to create a proper software development process.


Well, I apologize if I came off as harsh. I don't mean you specifically. I just adamantly believe that we need to work towards processes that allow you to trust the developer. Every situation is unique, however, and I cannot speak to all of them, but I do think screen-recorded proof of work is not useful. Also, if people are forced to screen record, there is no scalability in a project's software development.


Consider yourself lucky to be on a team where people are competent at testing the features they create.


Lucky? I'm unlucky. Where tf are these jobs where you get away with submitting code that doesn't work? The amount of process, procedures, and standards we have to follow is a lot, and it is exhausting. I need to find a job where I can get away with trash work.


This is great, do you do this in github? Or elsewhere? Just curious if you had a template to hand that myself and others could use =)...


The PR/MR template itself is pretty generic. You can make one yourself quickly by referring to GitHub's markdown table documentation: https://docs.github.com/en/get-started/writing-on-github/wor...
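A minimal sketch of the idea (on GitHub a template like this lives at .github/pull_request_template.md, on GitLab under .gitlab/merge_request_templates/):

    ## Summary

    <!-- What changed and why -->

    ## Before / After

    | Before | After |
    | ------ | ----- |
    | (screenshot) | (screenshot) |

    ## Screen recording

    <!-- Attach a recording for complex user interactions -->

    ## Checklist

    - [ ] Browser back button still behaves as expected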


"MR" so probably Gitlab


Ah, didn't know that; I'm less familiar with GitLab.


I’ve found this works extremely well for visual changes and can’t believe it when it’s not the default behavior!



