Just Use Curl

155 points | posted 11 hours ago by imiric | 127 comments

hypeatei

10 hours ago

There's also hurl: https://hurl.dev

You define all your requests in a plaintext format and can inject variables etc... plus the name is kinda funny.
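For a taste, a minimal Hurl file looks something like this (the host and endpoint are made up; `{{host}}` is a Hurl variable you'd supply with `--variable host=...`):

```hurl
# Fetch a user and assert on the status code
GET https://{{host}}/api/users/1

HTTP 200
```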

jicea

5 hours ago

Hi, maintainer of Hurl here. Thanks for the link! Hurl uses libcurl as its HTTP engine (you can, for instance, use --curl to get an export of curl commands). In a few words: if you just use Hurl, you just use curl also...

dprotaso

5 hours ago

Curious: is there a Go lib that can read the config file? I’d love to have this in some testing to make it more accessible.

anotherhue

8 hours ago

Emacs had something like this long ago (of course it did).

https://github.com/pashky/restclient.el

I also like httpie but they seem to have gone commercial.

sointeresting

7 hours ago

I use emacs restclient all the time. I have requests saved in my orgmode notes that I can run with a simple C-c C-c. It's great.
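For anyone who hasn't seen it, such a saved request in an org file looks roughly like this (the endpoint is invented; running it with C-c C-c inside org needs the ob-restclient glue package):

```
#+begin_src restclient
GET https://api.example.com/users/1
Accept: application/json
#+end_src
```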

4b11b4

5 hours ago

You're using this as well?

https://github.com/alf/ob-restclient.el

sointeresting

2 hours ago

Ya, I babel all sorts of things. I have SQL queries saved in my notes for things like creating a user after I reset the project locally, plus common REST queries. I create an orgmode heading for every ticket I pick up, and I often end up with REST and/or SQL queries littered in the notes for testing and development. I can run them right from the notes, and when I'm done with the ticket I can just copy/paste them into the pull request description to fill out the manual test steps for those reviewing.

blueflow

10 hours ago

Basically a shell script with a curl invocation in it, except you need to install extra software to execute it. Misses the point of the article.

hypeatei

10 hours ago

I mean, the article did say:

  It doesn't need to render a fucking Chromium instance to make a web request. It doesn't depend on a service to run. It doesn't require an "Enterprise" subscription for basic features.
So I'd say it meets all of the criteria except being on your machine already.

millerm

9 hours ago

Yup. Article mentioned `jq` as well. That's an external tool (and so is cURL if we are all being honest here).

bravetraveler

9 hours ago

Their words, not mine; the first header: "It's already on your machine". We can belabor it, but the domain is 'justuse'. No room for 'except' [unless you're reasonable, of course].

The 'egregious' things are charging to share what will fit very well in SCM (preventing real automation)... and breaking due to Online First/only. It makes sense to require the endpoint I'm talking to. Why would Postman need AWS/us-east-1 [0] for a completely unrelated API? Joyful rent-seeking.

cURL, your suggestion (hurl), or HTTPie all make far more sense. Store what they need in the place where you already store stuff. Profit, for free: never go down. Automate/take people out of the button-pushing pattern for a gold star.

0: https://news.ycombinator.com/item?id=45645172

c0balt

10 hours ago

> The UI/UX is perfect

While I like curl, this is highly subjective; some people just prefer a GUI that can guide you and/or be visually explored.

This whole piece also reads like someone is quite angry at people preferring a different workflow than them. Some aspects, like shell history, are also not the magic bullet they propose here, as it doesn't, e.g., cover the actual responses.

Curl's ability to do almost everything is a minor curse here too as it means that any documentation (man pages, options help) is very large.

catapart

10 hours ago

Just fyi: the writing style is a meme. I think it's following the "just use html"[0] format (though, that may be its own riff on the meme).

Not to detract from your point; just to say that the author is probably not as angry as this could make them seem.

[0] https://justfuckingusehtml.com/

c0balt

5 hours ago

Thank you for that pointer/explanation. That does make the choice of words understandable to some extent even if it feels a bit excessive.

alt187

10 hours ago

Thanks! I didn't know this gem, definitely saving it.

dlopes7

10 hours ago

Not to mention saving headers, tokens, doing multiple requests using the results from the previous, etc.

This guy would say "just use bash" and ignore the average user experience.

andoando

10 hours ago

You can do all that in a shell script though

dns_snek

10 hours ago

Of course you can, but shell scripting really fucking sucks.

One moment you have a properly quoted JSON string, the next moment you have a list of arguments, oops you need to escape the value for this program to interpret it right, but another program needs it to be double-escaped, is that \, \\, or \\\\? What subset of shell scripting are we doing? Fish? modern Linux bash, macOS-compatible bash? The lowest common denominator? My head is spinning already!

If I want to script something I'm writing Python these days. I've lost too much sleep over all the "interesting" WTF situations you get yourself into with shell scripting. I've never used Hurl but it's been on my radar and I think that's probably the sweet spot for tasks like this.

andoando

7 hours ago

Ok agreed. But you can also just use any scripting language of your choice. Now you have a completely open development platform for your APIs.

I guess parsing cmd line outputs would be annoying. Would be a worthwhile library to write.

skydhash

10 hours ago

That’s mostly what I do when I need to interact with an API:

  _plz() {
    curl [rest of common args] -X "$1" "$2"
  }
Then:

  _plz GET endpoint

JeremyNT

5 hours ago

I love curl and use it all the time (copying requests from the browser is maybe the most common usage) but:

> Q: But Postman has testing and automation!

> A: So does cURL in a shell script with || and && and actual programming languages. You want assertions? Pipe to grep or write a 3-line Python script. Done.
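To make that concrete, here's a minimal sketch of the grep-style assertion; the JSON literal stands in for the output of a hypothetical `curl -s https://api.example.com/health` call.

```shell
# Pretend this came from: curl -s https://api.example.com/health (made-up URL)
response='{"status":"ok"}'

# Assert on the body and exit non-zero on failure, so && / || chains work
if printf '%s' "$response" | grep -q '"status":"ok"'; then
  echo "PASS"
else
  echo "FAIL" >&2
  exit 1
fi
```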

IMO if you're reaching for an "actual programming language" it's probably time to put curl down and switch to libcurl or whatever native equivalent is in that language.

theshrike79

5 hours ago

The Google style guide is a good reference: https://google.github.io/styleguide/shellguide.html#s1.2-whe...

  - If you are writing a script that is more than 100 lines long, or that uses non-straightforward control flow logic, you should rewrite it in a more structured language now. Bear in mind that scripts grow. Rewrite your script early to avoid a more time-consuming rewrite at a later date.
  - When assessing the complexity of your code (e.g. to decide whether to switch languages) consider whether the code is easily maintainable by people other than its author.

cassianoleal

4 hours ago

Cute, but a few years back $client was an early-ish adopter of Anthos Service Mesh (Google-managed(ish) Istio for GKE), and to install it we had to run a Bash script that was over 1,000 lines long.

When we questioned the Google engineer assigned to support us, he snickered and said "you can trust it".

runxiyu

10 hours ago

Using -X POST is often wrong as it "changes the actual method string in the HTTP request [... and] does not change behavior accordingly" (Stenberg, 2015).

Although, it is correct for the article's mention of "Send POST requests"... just that typically people don't send POST requests out of the blue with no data.

kaoD

10 hours ago

I think you misinterpret the text you're quoting (but it's hard to tell since you didn't include a link).

-X POST is not wrong, it's just superfluous when using other flags like -d where the method can be inferred.

POST requests are often sent with no data (anything that is not idempotent should use POST, unless there's another verb that fits better).

jmholla

10 hours ago

Here's the article in question. [0] I think runxiyu is correct.

The author delves a bit more into the issue.

> One of most obvious problems is that if you also tell curl to follow HTTP redirects (using -L or --location), the -X option will also be used on the redirected-to requests which may not at all be what the server asks for and the user expected. Dropping the -X will make curl adhere to what the server asks for. And if you want to alter what method to use in a redirect, curl already have dedicated options for that named --post301, --post302 and --post303!

Per the man page (`man 1 curl`),

> The method string you set with -X, --request will be used for all requests, which if you for example use -L, --location may cause unintended side-effects when curl does not change request method according to the HTTP 30x response codes - and similar.

`-d` and `--data` will appropriately change the headers of their requests. Funnily, `--post301` and `--post302`, which have a similar effect to `-X POST`, are RFC 7231 compliant [1]; browsers just don't do that. [2][3] This is so ubiquitous that the status codes 307 and 308 were added to support the original behavior of repeating the request verbatim at the target address. Compare the following:

    > nc -l -p 8080 -q 1 <<< $'HTTP/1.1 301 Moved Permanently\nLocation: http://localhost:8081\n\n' & nc -l -p 8081 -q 1 <<< $'HTTP/1.1 200 OK\nContent-Length: 0\n\n' & curl -L --data test localhost:8080; wait
    > nc -l -p 8080 -q 1 <<< $'HTTP/1.1 301 Moved Permanently\nLocation: http://localhost:8081\n\n' & nc -l -p 8081 -q 1 <<< $'HTTP/1.1 200 OK\nContent-Length: 0\n\n' & curl -X POST -L --data test localhost:8080; wait
    > nc -l -p 8080 -q 1 <<< $'HTTP/1.1 308 Permanent Redirect\nLocation: http://localhost:8081\n\n' & nc -l -p 8081 -q 1 <<< $'HTTP/1.1 200 OK\nContent-Length: 0\n\n' & curl -L --data test localhost:8080; wait
What happens here:

1. In the 301 case with just `--data`, the request turns into a GET request when sent to the redirect.

2. In the 301 case with `-X POST`, the request stays a `POST` request, but doesn't send any data to the redirect.

3. Finally, in the case where the server returns a 308, we see the POST request is kept and the data is resent.

To further expand slightly on a different thing that might surprise some people, the data options will automatically set the content type by adding the header `Content-Type: application/x-www-form-urlencoded`, as if sending form data from a browser. This behavior can be overridden with a manual `-H`, `--header` argument (e.g., `-H 'Content-Type: application/json'`).

Edit: cube00 pointed out that newer versions of curl than mine have `--json` which will do that automatically. [4]

[0]: https://daniel.haxx.se/blog/2015/09/11/unnecessary-use-of-cu...

[1]: https://www.rfc-editor.org/rfc/rfc7231

[2]: https://evertpot.com/http/301-moved-permanently

[3]: https://evertpot.com/http/302-found

[4]: https://news.ycombinator.com/item?id=45655409

novoreorx

8 hours ago

When I made the script httpstat [1] 9 years ago, I had this exact thought: if I want to show the statistics of an HTTP request, why not just use curl? Why bother working out the figures myself? And since then, the more I use curl, the more I find it robust, sophisticated, and irreplaceable. It's the only thing and everything I need.

[1]: https://github.com/reorx/httpstat

cosmotic

7 hours ago

The author could benefit from some research into user-centered design. CLIs are notorious for poor discoverability and consistency, two hallmarks of GUIs (at least 20 years ago; these days less focus is put on these elements). Humans are not very good at remembering command line flags but great at looking at and then manipulating a screen that shows fields for all the flags.

cassianoleal

4 hours ago

That may be true for a tool that's not used often. For professional tooling, muscle memory and response time beat discoverability any day of the week.

GUIs are (usually) slow and break the flow. They cater to the bottom of the pile of users, to the detriment of the top.

taylodl

10 hours ago

Curl and jq are plenty to get the job done. As the author points out, you can capture all your curl commands in scripts and then orchestrate with other scripts for testing purposes. On top of all the benefits the author has already mentioned, you get a boost if you're doing your development work in a VM, as I do. It's less you have to install, configure, and manage. Sure, that can be automated, but it's just more stuff you have to take care of and the longer you have to wait for a fresh VM to be ready.
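As a concrete sketch of the curl + jq pattern (the JSON here is inlined in place of a live endpoint's response; the URL in the comment is made up):

```shell
# Stand-in for `curl -s https://api.example.com/users` (hypothetical URL)
response='{"users":[{"name":"ada"},{"name":"grace"}]}'

# Extract just the names, as you would from a real response
printf '%s' "$response" | jq -r '.users[].name'
```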

brap

10 hours ago

After decades it should have been obvious by now that the vast majority of users prefer GUIs.

CaptainOfCoit

10 hours ago

Yeah, hence why no one uses terminals for doing work with git, codex, claude-code, neovim and curl, everyone and their mother are using desktop GUIs for those things, clearly.

c-hendricks

10 hours ago

Inversely, plenty of people use GUIs for all of those things.

CaptainOfCoit

10 hours ago

Yes, I agree, and have no qualms with people who prefer GUIs. But to say that "majority of users (developers?)" prefer GUIs seems outlandish at best.

____mr____

10 hours ago

I am the only person on my 10-person team that prefers the CLI for stuff like git, and while the ratio was a little more balanced during my time at college, it was still skewed towards GUIs. I don't think it's unreasonable to think that developers might prefer GUIs over CLIs.

maleldil

8 hours ago

Many developers are on Windows, and those tend to prefer GUIs. CLI reliance is mostly a macOS/Linux thing.

hurricanepootis

8 hours ago

I know some Windows developers use WSL for all of their devving, and they are obviously gonna be working through the terminal.

aitchnyu

10 hours ago

I use JetBrains' git pull etc. since it doesn't fail on modified files and other problems.

nailer

10 hours ago

The ‘vast majority’ using GUIs as mentioned by the parent doesn’t preclude individuals using CLIs.

catapart

10 hours ago

Yeeeep.

My response to this article is the same one I have to anyone out here screaming "it's so easy to just do it my way!": if it's so easy, then do it for me!

The ffmpeg fans are the loudest screechers I've found, in this area. They'll point to the trainwreck of a UX that is Handbrake as an example of GUI for terminal commands. And, look - command line utilities are great and Handbrake is a super good product that functions well and does more than I'd ever want it to. But neither of those things are the same thing as having good UX.

If it's so easy to compile a bunch of shell scripts and store them in a directory in a git repo, then package together a bunch of them that every dev would need, sprinkle in a few with placeholders that most devs would need (with some project-specific input), and then serve them to me in a composable GUI that, in real time, builds the command to be run in my shell. Let me watch the clicks edit the command, and then let me watch the "submit" execute the command. There's no surer way for me to learn exactly what commands I need than to see them all the time. And if I have to learn them (so that I can use them) BEFORE I've seen them - in context - a few times at least, then I'm going to have a much harder time remembering. UX, when done right, helps the user.

Put simply: if I can do everything I would be able to do with postman using curl, then I should also be able to wrap curl in a thin DearImgui window that is reactive to user input. And if it's as easy as the author says, their time would have probably been better spent just making the GUI wrapper app and presenting it as a way to get better with curl, rather than writing an edgelordy article about it.

skydhash

10 hours ago

You can’t compose GUIs unless you go the Emacs and Smalltalk route (macOS AppleScript is a very poor example). The one thing with CLI tools is how easy it is to compose them and have something greater. They’re also more versatile than a GUI app.

Also they’re more stable than anything else. You can coast for decades on a script.

catapart

10 hours ago

Anything you can do in a terminal can be done with what I suggested because what I suggested was literally using buttons to write a terminal command.

skydhash

9 hours ago

Not really. In a terminal the whole input can be dynamic with variables, pipes, and executors like find’s exec, xargs, and parallel. That’s pretty much the whole point of a shell and CLI interaction.

catapart

7 hours ago

yes, really. If you think what I'm describing cannot enter text into a cli, you are thinking about what I am describing wrong. There's nothing you can put onto a command line with a keyboard that cannot be put there by an app using string manipulation. You're welcome to try to describe something, though.

skydhash

6 hours ago

There is nothing that you can’t. Visual programming languages do exist, and the shell is a REPL. What’s important is how well you can do it. If you nail down the common use cases, you can create a nice wrapper, and people have done so.

But text is very versatile. Adding another layer on top is losing that versatility. And while graphics is nice, symbolic manipulation is on a whole other level.

So for a closed, and I guess small, domain, you can have a GUI for intuitiveness. But if you want expressivity, you need symbols and formalism.

But there’s one thing that still beats graphics in terms of intuitiveness: tactility. I’d bet that it’s way faster for a person to learn a physical car dashboard than a touchscreen one.

pjc50

9 hours ago

What's supposedly wrong with Handbrake UX?

To me it seems like the complexity is just irreducible. There's so many formats, so many bits and pieces that can go in a video stream, they're not very visualizable, and they have surprising edge case interactions. Not to mention there's a lot of "normally the program figures this out for you, but there's an option to override it if broken" knobs and dials.

catapart

7 hours ago

Good UX is not about reducing complexity, nor is it about hiding complexity. It's about surfacing the exact utility a user needs in the exact context they are best suited to understand each of its inputs entirely (with the least friction in generating those inputs for the user). It's very hard to do. So much so that describing 'what is wrong' with a UX would be almost as burdensome as just designing a better UX. So I'm not going to tell you what is wrong with it; you KNOW what is wrong with it. It could be better. That you can't specify as to how just means that you aren't currently undertaking the complicated process of redesigning it. It doesn't mean you don't know good UX from bad UX.

Now, all of that aside, I do like Handbrake and I do think it offers a ton of functionality with so little friction that it's one of my very favorite and most-used apps. No login, no project setup, no x, no y, no z. Just a thin wrapper around a badass command line utility, with tons of options for users to override, and sensible defaults. There's a lot to love about Handbrake!

But "my grandma can use it", or "a plumber can use it", or "a person who doesn't understand the technicals and just needs to do one stupid thing that the app can definitely do, can use it" are signs of good UX. You wouldn't say any of those things about Handbrake.

hurricanepootis

8 hours ago

In my experience, handbrake doesn't expose every option from ffmpeg and is more focused on transcoding. One nitpick I have with handbrake is that it doesn't support VAAPI encoding nor Vulkan Video Encoding for AMD cards on Linux.

Aperocky

10 hours ago

define "user"

people needing/using curl would have been a very distinct subset of users.

internet_points

10 hours ago

TIL about jo, why did I not know about that ten years ago

    $ jo details[author]=Dickens details[age]=dead books="$(jo -a 'Oliver Twist' 'Great Expectations')"
    {"details":{"author":"Dickens","age":"dead"},"books":["Oliver Twist","Great Expectations"]}

cyberge99

3 hours ago

Is curl ever going to have a mature/sane way to handle silent output? I seem to have to redirect 2>&1 and juggle -o- and other options. It’s annoying. Always has been.

zamadatix

10 hours ago

Curl is great for running individual API calls; API clients are great for when you're actually working on the API or architecting an app with one. Not that there's anything in API development you can't do from the CLI, but there is a lot more to API clients than that feature list (it doesn't even include anything around what one might want to do with OpenAPI definitions!), and by the time you string it all together you get why people like the tool that did that for them instead.

andrewrn

10 hours ago

I might get lit on fire for this, but I don't find manpages very easy to use. If I want to quickly remember an option or argument order, I am met with a wall of text.

Does anyone have tips for how to make it more useful? Maybe I could grep better for options? For example in the link, the author lists out common curl commands like making a POST request or adding a header. If you tried to look through the manpage for this, this would take a long time.

There's another utility called tldr that does a better job of this by providing common usage examples that almost always instantly give me what I need, but it's not nearly as comprehensive as man.

sroerick

9 hours ago

I actually completely agree. I am learning OpenBSD and the man pages are very good, but all too often I find myself reading them, beating my head against the wall, and then googling or using tldr or gippity.

For example, I just was digging into BSD_auth and authenticate, and I don't know much about how auth works generally. I found it pretty tough to grok from the man page. I love the idea of learning everything from directly within the system and man pages, but I might just not be smart enough for that.

mfiro

10 hours ago

Add this to the end of .bashrc

function cheat() { curl "cht.sh/$1"; }

Then in terminal you can use the following to see the examples: $ cheat curl

basscomm

10 hours ago

> Does anyone have tips for how to make it more useful? Maybe I could grep better for options? For example in the link, the author lists out common curl commands like making a POST request or adding a header. If you tried to look through the manpage for this, this would take a long time.

You can search a man page by pressing the '/' key, typing in what you want, and pressing 'enter'. 'n' jumps to the next instance of your search string; 'N' jumps to the previous instance.

andrewrn

an hour ago

Okay good point… I still think common usage examples would be nice because I don’t even know what some commands are capable of

hurricanepootis

7 hours ago

I often find it easier to read man pages in the browser. Use `man --html=<browser executable> curl`.

gbuk2013

10 hours ago

Ask your AI tool of choice - it’s great at reading manpages. I also add my most favourite prompt instruction “be succinct”. :)

eddywebs

10 hours ago

This is very funny, love it! Every developer needs to know basics like this. I hope this site continues.

garciasn

10 hours ago

This person is my spirit animal. "You can remember 400 Kubernetes commands but not curl -X POST?"

That was exactly what I needed this morning.

jonathanberger

10 hours ago

Does anyone have suggestions for when I need to use bearer auth and the token is super long?

With curl I end up finding the command becomes hard to read, even taking advantage of backslashes. With Postman, it tidily hides the token out of the way on a separate tab and gets out of my way.

humlex

10 hours ago

What I do is assign the token to a variable. I typically copy the secret to my clipboard, and then use the pbpaste command in the macOS terminal when assigning it, to avoid secrets in my command history.

Izkata

an hour ago

I don't know how consistent this is across shells, but at least in bash putting a space before the command keeps it out of the history:

  $ ONE=1
  $  TWO=2
  $ echo $ONE $TWO
  1 2
  $ history | tail -n 4
   2002  clear
   2003  ONE=1
   2004  echo $ONE $TWO
   2005  history | tail -n 4

miningape

9 hours ago

this.

A while ago I was working on a DSL to solve this exact issue (env switching, http requests + chained requests e.g. to an auth server to retrieve a token) - but I haven't had the time recently, and I moved jobs to a GraphQL shop, so it feels a bit more pointless now :D

KORraN

8 hours ago

I love the second part of your tip, thank you.

skydhash

10 hours ago

I think curl can load headers or other data from a file. And you can always $(cat token.txt) in the cli.
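A tiny sketch of the $(cat token.txt) pattern with a throwaway file (the path is hypothetical; curl 7.55+ can also read headers straight from a file with `-H @headers.txt`):

```shell
# Keep the token in a file instead of on the command line
printf 'secret-token' > /tmp/token.txt

# Build the header with $(cat ...) so the secret never sits in your history
auth_header="Authorization: Bearer $(cat /tmp/token.txt)"
echo "$auth_header"
```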

hi41

10 hours ago

I am a total newbie to curl. I am so excited to come across this post. Thanks, op! I want to use curl to send json and xml requests instead of using Postman and SoapUI while also using a jks file which stores a certificate for secure connection to API.

(Clarified regarding certificate)

opengrass

6 hours ago

Business idea: Firefox or Chromium fork replacing http(s), or beg Ladybird developers to make it standard.

eddieroger

10 hours ago

It's clear at this point that terminal apps have lost to GUIs, but cURL is the one place where I think that's a shame. cURL /always just works/. It is predictable, consistent, transparent, and pretty easy to use in its simple forms, but with plenty of room for complexity if you wish to go that far. There's a reason libcurl is in everything from automobile infotainment systems to toasters. I'm glad to use a GUI over libcurl that doesn't also need a cloud to work, but at the end of the day, I find myself piping cURL to jq more than almost anything else.

Way back when Postman was but a mere Chrome plugin, I spent a lot longer than I'd have liked fighting with a request that should have been logging GET requests but wasn't. Imagine my surprise when I found that it was following Chrome's caching rules and not actually making my requests despite me intentionally firing off those requests. If only I had just used cURL...

cabirum

10 hours ago

Instead of "-X POST" and "-H 'Content-Type: application/json'", use "-d" or "--json", asshat.

wraptile

10 hours ago

Curl's UX is very dated and I wouldn't recommend it to any new user. Use one of the many alternatives like httpie [1], or something like curlie [2], which is just a UX wrapper around the same libcurl. Httpie also has a postman-like web interface.

1 - https://httpie.io/

2 - https://rs.github.io/curlie/

behnamoh

10 hours ago

curl may be outdated but at least it's a survivor. the alternatives (esp. the suspicious httpie grift) won't make it.

wraptile

10 hours ago

Sure, libcurl is great, but the curl CLI is pretty ancient and awful, and completely unnecessary to use today, since any front-end can plug into libcurl and provide a much better experience. If you're only requesting your own APIs then you don't even need libcurl; any HTTP 1.1 client, like httpie, will provide a much better experience.

The issue with "survivor" software is that UX cannot be refined due to legacy support. What's great about curl is that libcurl and the CLI front-end are separate tools, allowing for alternative modern front-ends.

xd1936

10 hours ago

Can you define "the suspicious httpie grift"?

wraptile

9 hours ago

My guess would be they don't like the fact that httpie is branching out to paid (well, currently still $0) GUI desktop/web apps. The CLI remains open source under BSD, so I think OP is just yelling at clouds here.

deafpolygon

11 hours ago

I've been using curl, like, forever. I don't understand the preoccupation with Postman, et al. -- why pay for something that literally requires a little bit of light RTFM?

pjc50

10 hours ago

https://news.ycombinator.com/item?id=9224 "you can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP account could be accessed through built-in software." -- on why Dropbox should not exist.

People absolutely will pay for software rather than reading or thinking, if it makes doing the work easier. You may have heard of this thing called chatgpt.

(not being a web developer, I've only lightly used Postman, and it is definitely handy for things like authentication. Especially once you touch OAuth. But I uninstalled it once they went unnecessarily cloud)

holletron

10 hours ago

Because it's convenient. I use curl often, but admit to using Bruno even more often. And yes, I could have some organized scripts or something, but for playing with various APIs daily, sometimes importing whole .json collections, or even setting up credentials in one place and reusing them across all the requests from a collection - that's just fast, easy and convenient. Same for responses - yes, I could work with jq and analyze in the console, but often I don't really know what exactly I'm looking for, so it's just easier to have it visually parsed and click through items

Xenoamorphous

10 hours ago

I use both. It's quite convenient to have all the collections available and just send the request with a couple of clicks.

CaptainOfCoit

10 hours ago

For repeated commands, my projects have a Make/Just file that has cURL commands I wanna validate. Sometimes I even load JSON from a tests/fixtures/*.json file, that also can be reused for other non-E2E tests.
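For illustration, a sketch of such a Makefile (the target names, URL, and fixture path are invented; `--json` needs curl 7.82+):

```make
API ?= http://localhost:8080

.PHONY: health create-user
health:
	curl -fsS $(API)/health

create-user:
	curl -fsS --json @tests/fixtures/user.json $(API)/users
```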

Not sure how some developers could be so allergic to the terminal, don't you already spend a lot of time there?

Xenoamorphous

10 hours ago

> Not sure how some developers could be so allergic to the terminal, don't you already spend a lot of time there?

Who says I'm allergic to the terminal? I already stated that I use curl.

I could also ask why are some developers so allergic to any kind of UI. And they're very vocal about it. Just use whatever you want.

CaptainOfCoit

10 hours ago

> Who says I'm allergic to the terminal? I already stated that I use curl.

Preferring "a couple of clicks" vs "run one command" seems to indicate so; otherwise I'm not sure why someone would prefer the former instead of the latter.

Xenoamorphous

9 hours ago

I have dozens of collections with hundreds of requests, most sending complex payloads, all perfectly organised hierarchically. I'd rather use a collapsible UI for that, if you prefer to have hundreds of scripts in folders that's fine too.

Actually I don't even create those collections, we have OpenAPI/Swagger docs for all of our APIs and I just import them with a couple of clicks (which I'm sure there's a way to do with curl).

For the odd requests, and sharing requests with others? I use curl, no problem. I actually think I know it pretty well and very rarely need to look up any docs for it.

CaptainOfCoit

9 hours ago

> I'd rather use a collapsible UI for that, if you prefer to have hundreds of scripts in folders that's fine too.

No, I don't (what a shitty strawman), I create abstractions then, like any other project. Surely you don't have hundreds of completely original and bespoke requests? Previously I've handled thousands of requests by having a .csv to load from.

Xenoamorphous

7 hours ago

> Surely you don't have hundreds of completely original and bespoke requests?

Absolutely I do. It’s not like a few hundred endpoints is out of the ordinary in any mid sized company.

I can go and edit any of the requests with autoformatting, highlighting and whatnot.

As I say, you can keep your csv and abstractions, not trying to convince you that you should switch from whatever works for you.

troupo

10 hours ago

Even those who spend a lot of time in the terminal are capable of recognizing its limitations.

skydhash

10 hours ago

Maybe on a work computer. But I can’t be bothered with installing, updating, and running these kinds of bloat on my personal computers. Only two programs stay open for longer than a few hours: Emacs and Firefox.

The thing with simple tools is that bootstrapping is easy and versatile.

eloisius

10 hours ago

I’m also a CLI old timer, but there’s undeniable utility in having a Postman-like collection to test drive a mobile app API. You can save state from responses and use it in subsequent requests. E.g. log in, save the access token, create a post, save the id, post a comment under the post using the id. It’s all very useful, to say nothing of the fact that you can give said collection to non-technical stakeholders and they can solve a lot of their own problems without going to get one of the engineers to Do A Command Line(tm).

All that said, I wouldn’t touch Postman. Last time I needed something to fit this bill I looked around to find the open source equivalent and found Bruno.

hi41

10 hours ago

I am new to curl. We use a secure connection to the Visa and Mastercard APIs. We have the certificate in a JKS file.

Is there a way to send the JSON request that one sends in Postman, but with curl, while also using the JKS file?

Similarly, we use SoapUI to send XML requests. Is there a way to send those XML requests using curl while also using the JKS file?

Greatly appreciate your help.

cluckindan

10 hours ago

Come on. One google search for curl jks and the first StackOverflow result has what you are looking for.
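For the record, the usual recipe is to convert the JKS keystore to PEM once (curl can't read JKS directly) and then pass it with --cert. The filenames, passwords, and endpoints below are all placeholders:

```shell
# One-time conversion: JKS -> PKCS12 -> PEM. All names/passwords are placeholders.
jks_to_pem() {
  keytool -importkeystore -srckeystore client.jks -srcstoretype JKS \
    -destkeystore client.p12 -deststoretype PKCS12 \
    -srcstorepass changeit -deststorepass changeit
  openssl pkcs12 -in client.p12 -passin pass:changeit -out client.pem -nodes
}

# After that, any request -- JSON or SoapUI-style XML -- just adds --cert:
send_json() {
  curl --cert client.pem https://api.example.com/payments \
    -H 'Content-Type: application/json' -d @payload.json
}

send_xml() {
  curl --cert client.pem https://api.example.com/soap \
    -H 'Content-Type: text/xml' -d @request.xml
}
```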

cluckindan

10 hours ago

Some people like to think about the problems related to the actual work instead of looking up CLI tool manpages when they need to do something once in a blue moon.

I use curl liberally and also tend to create scripts around it to perform common tasks, but I still get why someone would prefer a GUI.

qsort

10 hours ago

If you're doing a lot of requests for testing or some other purpose I could see an argument for a graphical interface. Curl is a masterpiece but it's not that simple to use. But again, we're in $current_year and I'd be surprised if "hey Claude, can you cook up a curl request to do this and that" doesn't work.

andoando

10 hours ago

I use Postman a lot but this article was rather convincing.

Just save your requests in separate scripts and organize them.

And now you can run them from anywhere, including other scripts.

CaptainOfCoit

10 hours ago

Even simpler and free: `tldr curl` in your terminal gives you like 80% of what you need for day to day requests, `man curl` gives you 100% of what you need.

blueflow

10 hours ago

> ... literally requires a little bit of light RTFM?

This is why: people do not want to bother with the docs.

nucleardog

10 hours ago

cURL is an amazing tool, but it's more "HTTP client" and less "full blown API client".

The page even sort of acknowledges this... saying you manage your environments with environment variables. It doesn't mention how to extract data from the response, just jq for syntax highlighting. No explanation of combining these two into any sort of cohesive setup for managing data through flows of multiple requests. No mention anywhere on the page of working with an OpenAPI spec... many of the tools provide easy ways to import requests instead of manually reentering/rebuilding something that's already in a computer-readable format.

So the tl;dr here is "use cURL, and then rebuild the rest of the functionality in bash scripts you idiot".

I went down this path of my own accord when Insomnia was no longer an option. I very quickly found myself spending more time managing bash spaghetti than actually using tools to accomplish my goals.

That's why I use a full blown dedicated API client instead of a HTTP client. (Not Postman though. Never Postman.)

antisthenes

10 hours ago

Using curl, how would I send a collection of frequently used requests to a coworker?

Plain text file?

andoando

10 hours ago

Also mentioned in the article. Put them in scripts and put them on git, and you have version control now too.

users/create.sh, users/delete.sh, etc
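Such a file can be tiny. A hypothetical users/create.sh, where the endpoint and fields are made up and BASE_URL is expected from the environment:

```shell
# users/create.sh -- hypothetical endpoint; expects BASE_URL in the environment.
# Usage: BASE_URL=https://api.example.com users/create.sh alice alice@example.com
create_user() {
  curl -s "${BASE_URL:?set BASE_URL}/users" \
    -H 'Content-Type: application/json' \
    -d "{\"name\": \"$1\", \"email\": \"$2\"}"
}
```

In the actual file you'd end with `create_user "$@"` so the positional arguments are forwarded.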

skydhash

10 hours ago

> Plain text file?

Works everywhere.

It could be a script or a markdown file with code blocks. I believe there are wrappers with more codified formats, like .http.

ebiester

10 hours ago

They're put in git. You treat it like any other source. You might even have them as a set of shell scripts.

deafpolygon

10 hours ago

Yes, plain text. Copy & paste has worked for a very long time.

troupo

10 hours ago

postman used to be a relatively lightweight client with a tolerable UI to quickly set up and test what you needed.

Then it turned into a monstrosity.

Havoc

10 hours ago

> It's already on your machine, dipshit

Does anyone actually enjoy this uhm style of writing?

seandoe

10 hours ago

I like it. It makes me smile. Don't take it personally.

bravetraveler

10 hours ago

Ditto, enjoy the catharsis. Good advice to not take it personally, I'll try to give a less aggressive point of view. All of this has come to mind [but not repeated out of kindness or laziness, whichever].

So, to start: someone wants me to install Postman/similar and pay real money to share and make a request? Absolutely not. I can read the spec from Swagger, or whatever, too... and write down what was useful [for others]. We all have cURL or some version of Python.

Surely a few phrases of text worth making plans to save, and paying for [at least twice: you to research and them to store], are worth putting into source control. It's free, and even pays dividends. How? Automation that works faster than a human pushing a button. Or creates more buttons!

DoctorOW

10 hours ago

Foul language has never really bothered me, and I think it's effective in communicating a relatable (to me) frustration with people ignoring the answer staring them in the face.

> The tools you need are simple. They're fast. They're reliable. They've been battle-tested by millions of people for years. Just fucking use them.

kenforthewin

10 hours ago

It's a very millennial flavor of Reddit-coded edginess that appeals mostly to other Reddit millennials.

deaux

10 hours ago

I like reading it sometimes. It doesn't make me more likely to do what it suggests though, if anything potentially less likely. Like a "Haha, what a guy. Now let's get on with my day" kind of vibe.

forgetfulness

10 hours ago

13 years old me, back when Maddox offered a very fresh and novel form of content you could grab on your way through the Information Superhighway

taylodl

10 hours ago

It takes me back. This was normal discourse in the 80s and 90s in the development community, especially the BBSs and telnet communities. I think the entire development community back then was afflicted with Tourette's Syndrome!

amelius

10 hours ago

As long as I agree to what is being said, I like this style :)

crims0n

10 hours ago

Me, I have a dark and immature sense of humor.

antisthenes

10 hours ago

What a weird place to try and raise moral panic.

Do you ask people "Do you actually enjoy talking like that?" every time you hear a curse word?

Havoc

7 hours ago

> What a weird place to try and raise moral panic.

Sigh. No moral panic involved and I don’t care if people swear. I asked about the style for a reason.

It’s a bit like if someone makes technical posts written in archaic English or in pirate speak. They’re free to do so of course but it’s still a weird choice given context

bdcravens

10 hours ago

> Now everyone's downloading 500MB Electron monstrosities that take 3 minutes to boot up just to send a fucking GET request.

While curl is fine, most of the time I use the REST Client extension in VS Code. While VS Code is an Electron monstrosity, assuming you already have it, that extension is less than 3MB.
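For reference, that extension reads plain `.http` files: requests separated by `###` lines, sent with a click. The endpoint below is made up:

```http
### Create a user (hypothetical endpoint)
POST https://api.example.com/users HTTP/1.1
Content-Type: application/json

{"name": "alice"}

### Fetch the user back
GET https://api.example.com/users/1 HTTP/1.1
```

Being plain text, the file versions and shares like any other source.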

Even the full-feature GUI extensions like Thunder Client are scarcely bigger.

Hate VS Code, and never let your hands touch anything other than vim or emacs? Fine, there's a number of extensions that run in the browser that do the same thing.

eknkc

10 hours ago

I use httpie (well, not httpie exactly, but https://github.com/ducaale/xh, which has the exact same UX). For the life of me, I can't remember curl flags for some reason. Even fucking -X POST... Sending JSON is a pain too.

For quick and easy http requests, httpie has been fantastic.
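The verbosity gap is real enough to show side by side. Same hypothetical POST, two ways (endpoint and fields invented):

```shell
# With curl you spell out the header and the JSON by hand.
# (-d already implies POST, so -X POST is actually redundant.)
curl_way() {
  curl -s https://api.example.com/users \
    -H 'Content-Type: application/json' \
    -d '{"name": "alice", "admin": true}'
}

# xh/httpie default to POST when given body fields and build the JSON for you:
# name=alice is a string field, admin:=true is raw JSON.
xh_way() {
  xh api.example.com/users name=alice admin:=true
}
```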

behnamoh

10 hours ago

xh is the mandatory Rust rewrite of httpie :)

insane_dreamer

7 hours ago

I use httpie; it's like curl but with easier syntax.

on_the_train

8 hours ago

Unless you're on Windows, which comes with a special version of curl that's missing some crucial functionality.

I don't think I've ever successfully used curl in my life, though. Every time, there's confusion about parameters. It's always been faster to just write a quick Python script that uses requests.

Plus, the author can be a bit special. One of the most overrated pieces of software on the planet.

ocdtrekkie

10 hours ago

Heh the home page is good too.