• Lauchmelder@feddit.org · 3 months ago

    Why spend 30 seconds manually editing some text when you can spend 30 minutes cobbling together a pipeline involving awk, sed, and jq?

      • Tangent5280@lemmy.world · 3 months ago

        The important part is to learn the limits of any tool. Nowadays I no longer use jq for anything long or complicated. Filter and view data? jq is fine. Anything more and I just cook up a Python script.
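
        A quick filter-and-view is about where I draw the line, e.g. (made-up JSON):

        $ echo '{"name": "foo", "items": [1, 2, 3]}' | jq '.items | length'
        3
        $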

          • Tangent5280@lemmy.world · 3 months ago

            How do you get complex data structures to work? I was alienated from scripting in zsh because I wanted something like a dict and realised I would have to write my own implementation. Is there a workaround for that?

            • tal@lemmy.today · 3 months ago

              I mean, there’s a point in data structure complexity where it’s useful to use Python.

              But as to dicts, sure. You’re looking for zsh’s “associative array”. Bash has it too.

              zsh

              $ typeset -A mydict
              $ mydict[foo]=bar 
              $ echo $mydict[foo]
              bar
              $
              

              bash

              $ typeset -A mydict
              $ mydict[foo]=bar
              $ echo ${mydict[foo]}
              bar
              $
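
              And if you need to loop over the whole thing, bash lets you list the keys (zsh spells it ${(k)mydict}); bash syntax shown:

              $ for key in "${!mydict[@]}"; do echo "$key -> ${mydict[$key]}"; done
              foo -> bar
              $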
              
              • Tangent5280@lemmy.world · 3 months ago

                This will do nicely. I had several workflows where I’d hit an API and get a massive, deeply nested JSON blob as output; I’d use jq to pull the specific data out of the whole thing and then do a bunch of stuff with that filtered data. I pretty much resigned myself to using Python because I’d have successively more complicated requirements, and looking up how to do each new thing was slowing me down massively.
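
                As a sketch of that kind of pipeline (the endpoint and field names here are made up):

                $ curl -s https://api.example.com/items | jq -r '.results[] | select(.size > 100) | .name'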

  • Ŝan • 𐑖ƨɤ@piefed.zip · 3 months ago

    Ok, þe quote misplacement is really confusing. It’s

    awk '{print $1}'
    

    How can you be so close to right about þis and still be wrong?

  • lime!@feddit.nu · 3 months ago

    my favorite awk snippet is !x[$0]++, which works like uniq but doesn’t need the input sorted first. basically, it’s equivalent to print_this_line = (line_cache[$current_line] == 0); line_cache[$current_line] += 1; if print_this_line then print $current_line end.

    really useful for those long spammy logs.
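
    for example:

    $ printf 'a\nb\na\nc\nb\n' | awk '!x[$0]++'
    a
    b
    c
    $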

    • grrgyle@slrpnk.net · 3 months ago

      Oh that’s very interesting. I usually do sort --unique or sort [...] | uniq if I need specific sorting logic (like by size on disk, etc).

      • tal@lemmy.today · 3 months ago

        Looking at the above awk snippet, though, it’ll retain order. sort will normally change the order; the awk snippet won’t, it just skips occurrences of a given line after the first one. Depending upon the use case, that order retention could be pretty desirable.
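
        For example:

        $ printf 'b\na\nb\nc\n' | sort -u
        a
        b
        c
        $ printf 'b\na\nb\nc\n' | awk '!x[$0]++'
        b
        a
        c
        $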

  • CubitOom@infosec.pub · 3 months ago

    I’ve become a person who uses awk instead of grep, sed, cut, head, tail, cat, perl, or bashisms.

  • DreamButt@lemmy.world · 3 months ago

    In all my years I’ve only used more than that a handful of times. Just don’t need it, really.

    Now jq on the other hand…

  • otacon239@lemmy.world · 3 months ago

    I used awk for the first time today to find all the MD5 sums that matched an old file I had to get rid of. Still have no idea what awk was needed for. 😅 All my programming skill is in Python. Linux syntax is a weak point of mine.

    • Ephera@lemmy.ml · 3 months ago

      Probably the very same thing that the post talks about, which is extracting the first word of a line of text.

      The output of md5sum looks like this:

      > md5sum test.txt
      a3cca2b2aa1e3b5b3b5aad99a8529074 test.txt
      

      So, it lists the checksum and then the file name, but you wanted just the checksum.
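
      So piping it through awk, something like this, hands you just the checksum:

      > md5sum test.txt | awk '{print $1}'
      a3cca2b2aa1e3b5b3b5aad99a8529074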

      • tal@lemmy.today · 3 months ago

        To be fair, a lot of programs don’t separate fields with a single character, they pad with multiple spaces between fields, and cut doesn’t collapse whitespace, so you probably want something more like tr -s " "|cut -d" " -f3 if you want behavior like awk’s field splitting.

        $ iostat |grep ^nvme0n1
        nvme0n1          29.03       131.52       535.59       730.72    2760247   11240665   15336056
        $ iostat |grep ^nvme0n1|awk '{print $3}'
        131.38
        $ iostat |grep ^nvme0n1|tr -s " "|cut -d" " -f3
        131.14
        $
        
        • TechLich@lemmy.world · 3 months ago

          I never understood why so many bash scripts pipe grep to awk when regex matching is one of awk’s main strengths.

          Like… Why

          grep ^nvme0n1 | awk '{print $3}'

          over just

          awk '/^nvme0n1/ {print $3}'

          • FooBarrington@lemmy.world · 3 months ago

            Because by the time I use awk again, I’ve completely forgotten that it supports this stuff, and the discoverability is horrendous.

            Though I’d happily fix it if ShellCheck warned against this…

        • ThunderLegend@sh.itjust.works · 3 months ago

          This is awesome! Looks like an LPI1 textbook. Never got the certification, but I’ve seen a couple of books about it and remember seeing examples like this one.

    • Laurel Raven@lemmy.zip · 3 months ago

      This is definitely somewhere that PowerShell shines; all of that is built in and really easy to use.

      • Laser@feddit.org · 3 months ago

        People are hating on PowerShell way too much. I don’t really like its syntax, but it has a much better approach to handling data in the terminal. We have nu and elvish nowadays, but MS was really early with the concept, and I think they learned from the shortcomings of POSIX-compatible shells.

        • Laurel Raven@lemmy.zip · 3 months ago

          I really can’t stress enough how much power and flexibility come with an object-oriented shell, especially with the dotnet type system behind it.

          I think most people who hate it just do so either because it came from Microsoft (which… Yeah, that’s understandable), or because it’s a different way of thinking about it (and/or they spent a lot of effort learning how to parse data from strings effectively and hate that it’s made easier?). But love or hate it, it is effective and powerful, and I find myself missing that when working with bash.

    • bulwark@lemmy.world · 3 months ago

      I remember when I first stumbled across this manual: I was trying to look up a quick awk command and wound up reading the whole thing. It’s really one of the better GNU manuals.

  • pelya@lemmy.world · 3 months ago

    Everything you do with awk, you can do with Python, and it will also be readable.