Windows Updates Thread - June 2017 - XP Patches and Broken Office 2007/2010

Discussion in 'Business & Enterprise Computing' started by PabloEscobar, Dec 11, 2014.

  1. XtehseA

    XtehseA Member

    Joined:
    Aug 15, 2006
    Messages:
    140
  2. looktall

    looktall Working Class Doughnut

    Joined:
    Sep 17, 2001
    Messages:
    23,324
    Location:
    brabham.wa.au
    https://technet.microsoft.com/en-us/library/security/ms17-jan.aspx

    how the hell did they manage to do that?
     
  3. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
  4. looktall

    looktall Working Class Doughnut

    Joined:
    Sep 17, 2001
    Messages:
    23,324
    Location:
    brabham.wa.au
    i'm fully expecting there to be several new patches released for those operating systems just as soon as i've finished creating my new images.
     
  5. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
    But you have ADRs configured to automatically patch your WIMs, right? So it shouldn't really matter.
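    Even without ADRs, patching the WIM offline is only a handful of DISM commands; a rough sketch only (the mount directory, image path and update file name are placeholders):
    Code:
    # offline WIM servicing sketch - adjust paths/index to suit
    Dism /Mount-Image /ImageFile:D:\Images\install.wim /Index:1 /MountDir:C:\Mount
    Dism /Image:C:\Mount /Add-Package /PackagePath:C:\Updates\example-update.msu
    Dism /Unmount-Image /MountDir:C:\Mount /Commit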
     
  6. looktall

    looktall Working Class Doughnut

    Joined:
    Sep 17, 2001
    Messages:
    23,324
    Location:
    brabham.wa.au
    unfortunately not.
    i have ADRs configured to patch existing machines, but i'm not able to use SCCM for image deployment.
    we just don't have the bandwidth.

    we do builds offline using MDT.
    the image is built, WSUS runs, and then the image is sysprep'd and captured for deployment via MDT.
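    for reference, that last sysprep and capture step is roughly equivalent to doing this by hand (the capture path and image name below are just examples):
    Code:
    # generalise the reference machine, then capture it from WinPE - sketch only
    C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
    # boot into WinPE/MDT and capture the volume:
    Dism /Capture-Image /ImageFile:D:\Captures\REF001.wim /CaptureDir:C:\ /Name:"Win10 SOE"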
     
  7. elvis

    elvis Old school old fool

    Joined:
    Jun 27, 2001
    Messages:
    29,975
    Location:
    Brisbane
  8. Dukey

    Dukey Member

    Joined:
    Dec 22, 2002
    Messages:
    328
  9. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
    How does MDT save you bandwidth compared to having it in SCCM? I remember reading about your network topology before, but can't seem to recall it now.
     
  10. looktall

    looktall Working Class Doughnut

    Joined:
    Sep 17, 2001
    Messages:
    23,324
    Location:
    brabham.wa.au
    the links between our sites don't have much bandwidth.
    pushing content out to the DPs from the primary site server is a slow process that often needs to be done outside standard business hours (and considering most of our sites run 24/7, even that can have a large impact).

    instead we have a dedicated build server at about 8 sites, configured with MDT, WSUS etc.
    the images are built on and deployed from those servers to their respective sites by the local onsite IT support.

    drivers complicate matters too.
    we don't run a fleet of PCs consisting of only 2-3 models that get replaced every x years.
    instead it's a list of models as long as your arm, dating back to the dawn of time.
    managing the driver content on the DPs would be a fucking nightmare.
    instead the drivers are managed on the MDT build servers by the local onsite IT support.
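    for what it's worth, the driver imports on those MDT build servers are scriptable; a rough sketch (the deployment share path and driver source folder are made up):
    Code:
    # load the MDT module, map the deployment share, then import a folder of drivers
    Import-Module "C:\Program Files\Microsoft Deployment Toolkit\bin\MicrosoftDeploymentToolkit.psd1"
    New-PSDrive -Name "DS001" -PSProvider MDTProvider -Root "D:\DeploymentShare"
    Import-MDTDriver -Path "DS001:\Out-of-Box Drivers" -SourcePath "C:\Drivers\SomeModel"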
     
  11. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
    I think if you looked at the situation again you might reconsider your approach (depending on how many models you really have and how long it would take to test them).

    If you enable binary differential replication on your WIM file, you may be surprised at how little data ends up being transferred to your remote sites (although this definitely increases the compute load at the other end).

    Management of your WSUS/SUP would be very similar to what it is now.

    The issue of pushing updated content out from the central DP (or DPs) to the remote DPs can all be managed via the Rate Limits tab within SCCM. You could throttle during peak times (or disable it completely, which is the option I have taken at most sites) and increase the rate for off-peak times.

    The drivers would be the hardest part, but with SCCM 2012 the driver management is *significantly* easier than in 2007 (if that's what has raised your concern). You could import them all into SCCM 2012 and create a per-site package (temporarily, until you work out how best to organise them). Then you could use an Auto Apply Drivers step, rather than what I assume you're doing now, which is applying each model's or each site's driver pack to its respective build.

    These driver packages can also be deployed to remote sites with binary differential replication to minimise the impact on your links.

    I imagine if you started to centrally manage your drivers, you would find that it actually isn't that hard to manage even for a wide variety of hardware, for example if you split the categories into video, system, network etc. If you pull down the current versions from the major vendors I think you will find it easy to manage new version upgrades, as well as reduce the overall size of the driver store. The tough part of finding the odds and ends for the weird devices around the place has already been done for you; you simply need to work out which of these (from your collection of drivers across the sites) are actually required.

    Some of the benefits of this system would be the automated deployment of updates to all your machines centrally, the automated and centralised maintenance of your "SOE" WIM, and the reduced overhead from no longer having to manage all the individual instances, not to mention the centralised reporting of it all.

    I don't expect that you'll do this, but just thought it might be some food for thought. :)
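    If it helps, the driver import side of that is scriptable too; a rough sketch only, with a made-up site code, share paths and package name:
    Code:
    # load the ConfigMgr module and switch to the site drive (site code is an example)
    Import-Module "$env:SMS_ADMIN_UI_PATH\..\ConfigurationManager.psd1"
    Set-Location "ABC:"
    # import a driver and drop it into a per-site package
    Import-CMDriver -UncFileLocation "\\server\Drivers\SiteA\nic\netdriver.inf" -EnableAndAllowInstall $true
    New-CMDriverPackage -Name "Site A Drivers" -Path "\\server\DriverPackages\SiteA"
    # binary differential replication is the checkbox on the package's Data Source tab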
     
  12. looktall

    looktall Working Class Doughnut

    Joined:
    Sep 17, 2001
    Messages:
    23,324
    Location:
    brabham.wa.au
    thanks for the advice, and you're correct, i won't be doing this, primarily because i'm looking for a new job elsewhere and so am disinclined to put any effort into the current situation.

    hopefully wherever i end up working next has an existing setup in place that i can maintain and/or improve on.
    if not, perhaps then i can look into building it all up from scratch in a better fashion than what i currently have.
     
  13. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
    I've worked in quite a number of environments now, and I can't recall walking into one and thinking it couldn't use some improvements. With regard to SCCM specifically, it's almost like every business either clicks Next through the install wizard or gets an MSP to deploy a cookie-cutter mould to their environment, leading to all kinds of unreliability and general awkwardness. I'm sure if you find yourself a new environment you'll have plenty of work to do! (Best of luck on the job search front too.)
     
  14. tin

    tin Member

    Joined:
    Jul 31, 2001
    Messages:
    6,396
    Location:
    Narrabri NSW
    Not very businessy, but last night's Win10 updates broke my "proper" sound card (an old SB X-Fi). The cheap Realtek HDA kept going though, so most business stuff should be fine based on that one anecdote.... :D
    The fix was to just remove the device in Device Manager (I chose to delete the driver too - not sure if that was necessary). Worked straight away without a reboot.

    I'm starting to get sick of these "let's break shit" updates. Not looking forward to the update labelled "You didn't need that video card, right? (KB8403887762)".
     
  15. PabloEscobar

    PabloEscobar Member

    Joined:
    Jan 28, 2008
    Messages:
    9,917
  16. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
    Root Certification Authorities - Missing Certs

    Hey All,

    Since January (as a result of what I presume was Microsoft's SHA-1 deprecation) any PC that we image with our previously working Windows 10 build is missing a huge number of certificates from the Trusted Root Certification Authorities and Third-Party Root Certification Authorities stores. I've had a look at some of the older machines and confirmed the certificates were signed with SHA-1, but thought that they would be exempt.

    We've disabled the following Group Policy:
    Code:
    Computer Configuration/Administrative Templates/System/Internet Communication Management/Internet Communication settings
    Turn off Automatic Root Certificates Update
    
    Which relates to the following reg key
    Code:
    HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\SystemCertificates\AuthRoot\DisableRootAutoUpdate
    
    This allows the certificate store to be updated from Microsoft via Windows Update. The store then starts populating with certificates as you hit them, e.g. navigating to various websites adds the relevant root cert to the trusted store. However, it does not return them all to the store, and that is causing other issues for us.

    The major issue is that it is preventing our internally hosted PAC file from functioning as expected, despite it being hosted over HTTP (rather than HTTPS).

    Anyone else run into this? I'm assuming this should be a relatively easy fix, however I just can't seem to work it out.
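    For what it's worth, the full root list can also be pulled down from Windows Update manually and imported in one go; a rough sketch (the file path is arbitrary, run from an elevated prompt):
    Code:
    # grab the current trusted root list from Windows Update into an SST file
    certutil -generateSSTFromWU C:\Temp\WURoots.sst
    # import the whole bundle into the machine's Trusted Root store
    Import-Certificate -FilePath C:\Temp\WURoots.sst -CertStoreLocation Cert:\LocalMachine\Root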
     
  17. PabloEscobar

    PabloEscobar Member

    Joined:
    Jan 28, 2008
    Messages:
    9,917
    May not be relevant, but I did notice the following in the patch notes for the latest build:

    Addressed issue that loads websites that bypass the proxy server in the local intranet zone when the Intranet Sites: Include all sites that bypass the proxy server (Disabled) is set.

    We've had some strangeness with what Windows determines to be a "local" site (anything with a (.) in it was classed as not local). G fucking G.
     
  18. rainwulf

    rainwulf Member

    Joined:
    Jan 20, 2002
    Messages:
    3,907
    Location:
    bris.qld.aus
    Ran into that myself too. Thought it was bizarre. Had to pull a copy of the security zone information from a working machine.
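    Something along these lines, if anyone needs to do the same (these are the per-user assignments; adjust if yours are set machine-wide or pushed by policy):
    Code:
    # on a known-good machine, export the zone assignments
    reg export "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains" C:\Temp\zones.reg
    # on the broken machine, import them back in
    reg import C:\Temp\zones.reg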
     
  19. freaky_beeky

    freaky_beeky Member

    Joined:
    Dec 2, 2004
    Messages:
    950
    Location:
    Brisbane
    Thanks for the suggestion. I used the short name rather than the FQDN for my PAC file path, but unfortunately that had no effect.

    If I image a new machine now, I cannot navigate to any website unless I explicitly specify the proxy rather than use the (working everywhere else) auto-configuration PAC.

    Strangely enough, if I leave the home page to load for an excessively long time (minutes) it will eventually load, as long as I have the aforementioned Group Policy configured to allow it to pull down the certificate. This website's parent domain is also in the Local Intranet zone via Group Policy. :mad:
     
  20. looktall

    looktall Working Class Doughnut

    Joined:
    Sep 17, 2001
    Messages:
    23,324
    Location:
    brabham.wa.au
    What if you use an IP address for your PAC file path?
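    e.g. if it's being set via the registry (or the matching GPO preference), pointing it at an IP instead would look something like this - the address and file name are made up:
    Code:
    # point WinINET/IE at the PAC file by IP instead of hostname
    reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v AutoConfigURL /t REG_SZ /d "http://10.0.0.10/proxy.pac" /f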
     
