Consolidated Business & Enterprise Computing Rant Thread

Discussion in 'Business & Enterprise Computing' started by elvis, Jul 1, 2008.

  1. theSeekerr

    theSeekerr Member

    Joined:
    Jan 19, 2010
    Messages:
    3,605
    Location:
    Broadview SA
    So that he can reinvent NoSQL, then reinvent relational databases on top of NoSQL, and then hook it up via exabit token ring 6G, and then not actually build any of it because I'm unconvinced that he builds anything other than the occasional misconfigured whitebox?

    Might be easier to google.
     
  2. gav1ski

    gav1ski Member

    Joined:
    Aug 9, 2001
    Messages:
    154
    Location:
    Sydney
    This is what I do, including the Excel bit. I write update queries almost daily, and I still google how to write an UPDATE using a SELECT statement; it just does not seem to stick in my head.
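    For what it's worth, the shape I always end up googling looks roughly like this (a T-SQL sketch against hypothetical products/staging_prices tables, not anything real):

        -- update the target table from the result of a select,
        -- joining staging rows onto the rows they should overwrite
        UPDATE t
        SET t.price = s.price
        FROM dbo.products AS t
        INNER JOIN dbo.staging_prices AS s
            ON s.product_id = t.product_id;

    Part of why it never sticks is that other engines differ: Postgres wants UPDATE ... SET ... FROM ... WHERE instead of the join syntax.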
     
  3. stiben

    stiben Member

    Joined:
    Aug 5, 2001
    Messages:
    464
    Location:
    Brisbane
    If you are processing a lot of records in a relational DB row-by-row then, as Tom Kyte from Oracle says, "you are doing slow-by-slow" ;)

    I recently got a process that was making users wait 2.5 hours down to 10 seconds by changing the code to work with the database rather than against it. It's also far easier to maintain for the next dev who looks at it.

    IMO you cannot be a good database application developer without being part DBA, and vice versa.
     
    theSeekerr likes this.
  4. Hive

    Hive Member

    Joined:
    Jul 8, 2010
    Messages:
    6,359
    Location:
    AvE
    Cisco Mobility Express. What a steaming pile of unrefined shit.
     
  5. cvidler

    cvidler Member

    Joined:
    Jun 29, 2001
    Messages:
    14,921
    Location:
    Canberra
    Yes, it's real easy to get something working with copypasta from Google (especially in dev, when there are 4 rows in the test database); it takes proper skill to get it working well.
     
  6. dakiller

    dakiller (Oscillating & Impeding)

    Joined:
    Jun 27, 2001
    Messages:
    8,308
    Location:
    3844
    If it was 2.5 hours for a one-time thing, then I'll take the 2.5 hours. If it's something that happens on an ongoing basis, I've got no problem learning how to optimise it, but it's going to take me longer than 2.5 hours to work it out.
     
  7. GumbyNoTalent

    GumbyNoTalent Member

    Joined:
    Jan 8, 2003
    Messages:
    9,782
    Location:
    Briz Vegas
    :thumbup:

    Let me guess: modified the query to run the transaction in smaller logical units, like per SOMETHING, then for ALL THINGS WITH STATE OF, thus limiting each transaction to a smaller logical block? Because lazy dev mode is "my code is fast when the database is 100 rows", and then in production 5 years later it's 2 billion rows. :rolleyes:
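    Roughly this shape, sketched in T-SQL against a hypothetical big_table (not claiming it's what stiben actually did):

        -- process the backlog in capped chunks so each transaction
        -- holds locks briefly, instead of one giant multi-hour transaction
        WHILE 1 = 1
        BEGIN
            UPDATE TOP (5000) dbo.big_table
            SET processed = 1
            WHERE processed = 0;

            IF @@ROWCOUNT = 0 BREAK;   -- nothing left to do
        END;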
     
  8. stiben

    stiben Member

    Joined:
    Aug 5, 2001
    Messages:
    464
    Location:
    Brisbane
    Not quite :) I'm actually using far larger logical units, i.e. set-based operations.

    The function was for loading external data. The new method is:

    1. pipe the entire set of binary data to the DB in a single step
    2. use XML methods to transform the data into rows and columns (even though the data is not XML)
    3. load the data using INSERT INTO ... SELECT FROM ...

    The time taken for step 1 varies with the volume of data to move. Steps 2 and 3 are always quite fast, even up to the million rows I measured (rough sketch of the pattern below).
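    As a sketch only (hypothetical payload and target table; the real code deals with binary data, escaping, and error handling):

        -- step 1: the whole delimited payload arrives in one round trip
        DECLARE @raw nvarchar(max) = N'alpha|bravo|charlie';

        -- step 2: wrap it in tags so the XML methods can shred it into rows
        DECLARE @x xml =
            CAST(N'<r>' + REPLACE(@raw, N'|', N'</r><r>') + N'</r>' AS xml);

        -- step 3: set-based load, a single INSERT ... SELECT
        INSERT INTO dbo.target (val)
        SELECT n.value('.', 'nvarchar(100)')
        FROM @x.nodes('/r') AS t(n);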

    Edit: almost forgot - I dropped a completely unnecessary index which was slowing down bulk inserts
     
    Last edited: Dec 23, 2020
    GumbyNoTalent likes this.
  9. GumbyNoTalent

    GumbyNoTalent Member

    Joined:
    Jan 8, 2003
    Messages:
    9,782
    Location:
    Briz Vegas
    LOL, had a Postgres DB with embedded JSON documents, which some numpty decided to index on... you guessed it, fields within the JSON and fields external to the JSON, and then wondered why performance was poor.
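    For the uninitiated, that means a pile of expression indexes like these (hypothetical orders table with a jsonb doc column), every one of which has to be maintained on every write:

        -- index on a field inside the JSON document
        CREATE INDEX idx_orders_doc_customer ON orders ((doc->>'customer_id'));
        -- composite index mixing a JSON field with a normal column
        CREATE INDEX idx_orders_doc_status ON orders ((doc->>'status'), created_at);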
     
  10. PabloEscobar

    PabloEscobar Member

    Joined:
    Jan 28, 2008
    Messages:
    14,494


    I wonder if doctors get asked to 'just do this quick surgery that you will never do in the real world, just to show you can' in their interviews
     
  11. theSeekerr

    theSeekerr Member

    Joined:
    Jan 19, 2010
    Messages:
    3,605
    Location:
    Broadview SA
    Doctors have degrees that mean three tenths of a shit ¯\_(ツ)_/¯
    In my experience the best candidates for IT roles are not the ones with the right papers.

    The SQL question does its job. People who can do it can do it. People who can't do it start telling me the truth about their experience with SQL. Both are useful outcomes.

    EDIT: If I had a buck for every time I forgot this forum doesn't do Unicode...
     
    Last edited: Dec 23, 2020
  12. wazza

    wazza Member

    Joined:
    Jun 28, 2001
    Messages:
    3,709
    Location:
    NSW
    I'm not a big fan of well-known interview questions, as you're just as likely to get people who have looked up "how to fizzbuzz" as you are to get those who have certs without the actual knowledge behind them. I also prefer to use things they're likely to encounter in the job, so for MSSQL I ask things like "show the total value of all orders in the last week from customer X" (shows whether they use variables for the customer, whether they hard-code the date or use offsets, etc.).
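    A passing answer looks roughly like this (hypothetical Orders schema; what I'm looking for is the variable and the date offset, not the exact table names):

        DECLARE @CustomerID int = 42;   -- parameterised, not hard-coded

        SELECT SUM(OrderTotal) AS WeekTotal
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID
          AND OrderDate >= DATEADD(day, -7, CAST(GETDATE() AS date));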

    I've done that more times than I'd like to admit, but that's largely when people give me data in Excel that won't cleanly go into the DB so I can do the whole query within it; otherwise I'd much rather work entirely within the DB itself.

    The number of times I see code from "professional" query writers/DBAs etc. that is just absolute shit surprises me. Most of the time it's clear that it "worked fine in dev" because they had a single user doing shit to very few records, so it didn't cause issues.

    RBAR is meant to be a worst-case scenario, but I see cursors thrown around in code like they're going out of fashion (hell, I wish they'd go out of fashion!). Then they try to "fix" it by throwing in rowlock hints (nice try, doesn't work) or artificially limiting the number of records it works on.

    One case wasn't quite to stiben's level, but a task that runs on a 5-minute interval and "worked fine in dev" suddenly had issues in the real world, where it had decent amounts of data, took longer than 5 minutes to process, and deadlocked a popular table in the meantime. It got cut down to single-digit seconds when I rewrote the query (roughly the before/after below)... and I'm a bloody sysadmin, not a DBA!
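    The before/after, sketched against a hypothetical orders table rather than the real schema:

        -- RBAR: one row per loop iteration, locks held the whole time
        DECLARE @id int;
        DECLARE c CURSOR FOR
            SELECT order_id FROM dbo.orders WHERE processed = 0;
        OPEN c;
        FETCH NEXT FROM c INTO @id;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            UPDATE dbo.orders SET processed = 1 WHERE order_id = @id;
            FETCH NEXT FROM c INTO @id;
        END;
        CLOSE c;
        DEALLOCATE c;

        -- set-based: one statement, one plan, seconds instead of minutes
        UPDATE dbo.orders SET processed = 1 WHERE processed = 0;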
     
  13. gav1ski

    gav1ski Member

    Joined:
    Aug 9, 2001
    Messages:
    154
    Location:
    Sydney
    I would not just say code; maybe a complete lack of understanding of how databases get data and how indexes work. Had a great one today where the person linked in about 10 tables that were never used in the output or selection criteria, mixed the joins up so the combination of left joins, inner joins and WHERE clauses effectively turned half the left joins into inner joins (see below), then threw a DISTINCT at the top because they were getting duplicate rows, as some of the extra linked tables were on the many side of the 1-to-many relationship.
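    The classic form of that mistake, on hypothetical customers/orders tables:

        -- a WHERE predicate on the outer table's columns throws away the
        -- NULL rows the LEFT JOIN produced, silently making it an INNER JOIN
        SELECT c.name, o.total
        FROM customers c
        LEFT JOIN orders o ON o.customer_id = c.id
        WHERE o.status = 'OPEN';    -- drops customers with no orders

        -- moving the predicate into the ON clause keeps the outer join intact
        SELECT c.name, o.total
        FROM customers c
        LEFT JOIN orders o ON o.customer_id = c.id AND o.status = 'OPEN';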

    I then get the call asking "can you get this working for me, as it's not getting all of the expected data out?" This is from the person who does most of the data extracts and reports for the company.
     
  14. melatonin

    melatonin Member

    Joined:
    Dec 13, 2012
    Messages:
    621
    Umm what?
     
    NSanity likes this.
  15. PabloEscobar

    PabloEscobar Member

    Joined:
    Jan 28, 2008
    Messages:
    14,494
    All it really tells you is "has this person interviewed at another stupid company before?"
     
  16. OP
    OP
    elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    43,994
    Location:
    Brisbane
    Going back in time a little to this discussion, Phoronix have taken a deep dive into what all of these CPU security issues are costing us in real-world performance:
    https://www.phoronix.com/scan.php?page=article&item=3-years-specmelt&num=1

    Whether we're paying for it on our own tin in wasted throughput, or paying for it to a cloud vendor in wasted cycles, we *ARE* paying for this in real dollars.

    What's the point of a new instruction set that yields +10% performance just to come bundled with a security flaw that requires a patch that yields -10% performance?
     
    GumbyNoTalent likes this.
  17. cvidler

    cvidler Member

    Joined:
    Jun 29, 2001
    Messages:
    14,921
    Location:
    Canberra
    Linux at least gives the end user the option to disable the mitigations.

    Considering these attacks only work if the attacker can already run code on your system (they're not remotely exploitable), it's an option you can weigh up the need for.

    A cloud vendor who has zero control over what code runs on their systems would 100% keep them patched.

    A home user/gamer will want 100% of the performance and turn them off.

    And of course there's plenty of middle ground there too.
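    For reference, on a typical GRUB-based distro it's one kernel command-line switch (a sketch assuming Debian/Ubuntu-style tooling):

        # /etc/default/grub -- turn off all CPU vulnerability mitigations
        GRUB_CMDLINE_LINUX_DEFAULT="quiet splash mitigations=off"
        # apply and reboot
        sudo update-grub && sudo reboot
        # current status is visible under /sys/devices/system/cpu/vulnerabilities/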
     
  18. OP
    OP
    elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    43,994
    Location:
    Brisbane
    Say, like a bit of code running in your browser?

    Speaking of things running in your browser, who wants a working SMB stack in WASM, with a websockets-to-TCP proxy bundled alongside it?

    https://twitter.com/SkelSec/status/1346517626026123268

    Are you ready for the next wave of ransomware? 2021 is going to be a hoot!
     
    Hive likes this.
  19. EvilGenius

    EvilGenius Member

    Joined:
    Apr 26, 2005
    Messages:
    10,786
    Location:
    elsewhere
    That stopped being completely true back in July.

    https://www.eweek.com/security/netspectre-attack-could-enable-remote-cpu-exploitation

    That attack yielded 15 bits per hour... not exactly going to set anyone's hair on fire, but they did at least show it was possible.
     
  20. itsmydamnation

    itsmydamnation Member

    Joined:
    Apr 30, 2003
    Messages:
    10,686
    Location:
    Canberra
    Because they aren't 10%, they're hundreds of %, and they aren't tied to an instruction set, they're ISA independent. Go back to your 486, Elvis.
    See, the thing is, even if you clocked a 486 or an A55 at 5GHz, 10GHz or more, it's still going to blow serious balls, because every memory access is going to be in the ~70-100ns range (at 10GHz that's up to 1,000 stalled cycles per access, which is exactly the latency speculation hides).

    Also, let's just ignore that on unaffected hardware most of the performance hit comes from devs doing stupid shit in the first place. Why are devs so LAZY!!
     
