News from political parties, made accessible

445 documents

A Visit to Where the Cloud Touches the Ground

Democraten NU Almelo 01-04-2024 21:33

Hi there! I’m Zander Rose and I’ve recently started at Automattic to work on long-term data preservation and the evolution of our 100-Year Plan. Previously, I directed The Long Now Foundation and have worked on long-term archival projects like The Rosetta Project, as well as advised/partnered with organizations such as The Internet Archive, Arch Mission Foundation, GitHub Archive, Permanent, and Stanford Digital Repository. More broadly, I see the content of the Internet, and the open web in particular, as an irreplaceable cultural resource that should be able to last into the deep future—and my main task is to make sure that happens.

I recently took a trip to one of Automattic’s data centers to get a peek at what “the cloud” really looks like. As I was telling my family about what I was doing, it was interesting to note their perception of “the cloud” as a completely ephemeral thing. In reality, the cloud has a massive physical and energy presence, even if most people don’t see it on a day-to-day basis.

Automattic’s data center network. You can see a real-time traffic map right here.

Given the millions of sites hosted by Automattic, figuring out how all that data is currently served and stored was one of the first elements I wanted to understand. I believe that the preservation of as many of these websites as possible will someday be seen as a massive historic and cultural benefit. For this reason, I was thankful to be included on a recent meetup for WordPress.com’s Explorers engineering team, which included a tour of one of Automattic’s data centers.

The tour began with a taco lunch where we met amazing Automatticians and data center hosts Barry and Eugene, from our world-class systems and operations team. These guys are data center ninjas and are deeply knowledgeable, humble, and clearly exactly who you would want caring about your data.

The data center we visited was built out in 2013 and was the first one in which Automattic owned and operated its servers and equipment, rather than farming it out. Building out our own infrastructure gives us full control over every bit of data that comes in and out, and reduces costs given the large amount of data stored and served. Automattic now has a worldwide network of 27 data centers that provide both proximity and redundancy of content for users and for the company itself.

The physical building we visited is run by a contracted provider, and after passing through many layers of security both inside and outside, we began the tour with the facility manager showing us the physical infrastructure. This building has multiple customers paying for server space, with Automattic being just one of them. The provider keeps technical staff on site who can help with maintenance or updates to the equipment, but, in general, the preference is for Automattic's staff to be the only ones who touch the equipment, both for cost and security purposes.

The four primary things any data center provider needs to guarantee are uninterruptible power, cooling, data connectivity, and physical security/fire protection. The customer, such as Automattic, sets up racks of servers in the building and is responsible for that equipment, including how it ties into the power, cooling, and internet. This report is thus organized in that order.

On our drive in, we saw the large power substation positioned right on campus (which includes many data center buildings, not just Automattic’s). Barry pointed out this not only means there is a massive amount of power available to the campus, but it also gets electrical feeds from both the east and west power grids, making for redundant power even at the utility level coming into the buildings.

The data center’s massive generators.

One of the more unusual features of this facility is that instead of battery-based instant backup power, it uses flywheel storage by Active Power. This is basically a series of refrigerator-sized boxes with 600-pound flywheels spinning at 10,000 RPM in a vacuum chamber on precision ceramic bearings. The flywheel acts as a motor most of the time, drawing power from the grid to keep it spinning. Then if the power fails, it switches to generator mode, pulling energy out of the flywheel to keep the power on for the 5-30 seconds it takes for the giant diesel generators outside to kick in.
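Those ride-through numbers can be sanity-checked with basic physics. The flywheel's mass and speed are given above, but its dimensions are not, so the disc radius below is an assumption, as is the 250 kW load; this is a rough sketch, not Active Power's spec:

```python
import math

mass_kg = 600 * 0.4536   # 600 lb converted to kilograms (~272 kg)
radius_m = 0.25          # assumed disc radius; not stated in the article
rpm = 10_000

# Solid-disc moment of inertia: I = 1/2 * m * r^2
inertia = 0.5 * mass_kg * radius_m**2

# Angular velocity in rad/s
omega = rpm * 2 * math.pi / 60

# Stored kinetic energy: E = 1/2 * I * omega^2
energy_j = 0.5 * inertia * omega**2
print(f"Stored energy: {energy_j / 1e6:.1f} MJ")

# Ride-through time at an assumed 250 kW load
load_w = 250_000
print(f"Ride-through at 250 kW: {energy_j / load_w:.0f} s")
```

With these assumed dimensions the stored energy works out to roughly 4.7 MJ, enough to carry a 250 kW load for around 19 seconds, which is consistent with the 5-30 second bridge described above.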

Those generators are the size of semi-truck trailers and supply four megawatts each, fueled by 4,500-gallon diesel tanks. That may sound like a lot of fuel, but it only amounts to about 48 hours of run time before more is needed. In the midst of a large disaster, road access problems and fuel shortages could limit the ability to refuel the generators, but in cases like that, our network of multiple data centers with redundant capabilities will still keep the data flowing.
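The tank and runtime figures imply an average burn rate worth a quick check. The full-load rule of thumb below is an assumption for comparison, not a number from the tour; the gap between the two rates suggests the 48-hour estimate assumes the generators running well below their 4 MW nameplate:

```python
tank_gal = 4_500   # diesel tank capacity from the article
runtime_h = 48     # stated run time before refueling

burn_rate = tank_gal / runtime_h
print(f"Implied average burn rate: {burn_rate:.1f} gal/h")

# Rough rule of thumb for diesel gensets (~0.07 gal/h per kW at full load);
# this is an assumption, not a figure from the article.
full_load_rate = 4_000 * 0.07
print(f"Approximate full-load burn rate at 4 MW: {full_load_rate:.0f} gal/h")
```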

Depending on outside ambient temperatures, cooling is typically around 30% of the power consumption of a data center. The air chilling is done through a series of cooling units supplied by a system of saline water tanks out by the generators.

Barry and Eugene pointed out that without cooling, the equipment will very quickly (in less than an hour) start lowering its power consumption in response to the heat, at the cost of performance. Barry also said that this radical throttling is more difficult to manage than if the equipment simply shut off. But if the cooling comes back soon enough, it allows for a faster recovery than a full shutdown would.

Handling the cooling in a data center is a complicated task, but this is one of the core responsibilities of the facility, which they handle very well and with a fair amount of redundancy.

Data centers can vary in terms of how they connect to the internet. This center allows for multiple providers to come into a main point of entry for the building.

Automattic brings in at least two providers to create redundancy, so every piece of equipment should be able to get power and internet from two or more sources at all times. This connectivity comes into Automattic’s equipment over fiber via overhead raceways that are separate from the power and cooling in the floor. From there it goes into two routers, each connected to all the cabinets in that row.

As mentioned earlier, this data center is shared among several tenants. This means that each one sets up their own last line of physical security. Some lease an entire data hall to themselves, or use a cage around their equipment; some take it even further by obscuring the equipment so you cannot see it, as well as extending the cage through the subfloor another three feet down so that no one could get in by crawling through that space.

Automattic’s machines took up the central portion of the data hall we were in, with some room to grow. We started this portion of the tour in the “office” that Automattic also rents, both to store spare parts and equipment and to provide a quiet place to work. On this tour it became apparent that working in the actual server rooms is far from ideal. With all the fans and cooling, the rooms are both loud and cold, so in general you want to do as much work outside of them as possible.

What was also interesting about this space is that it showed all the generations of equipment and hard drives that have to be maintained simultaneously. It is not practical to assume that a given generation of hard drives, or even of connection cables, will be available for more than a few years. In general, the plan is to keep all hardware using identical memory, drives, and cables, but that is not always possible. As we saw in the server racks, there is equipment still running from 2013, but it will likely have to be swapped out entirely in the near future.

Barry also pointed out that different drive technology is used for different types of data. Images are stored on spinning hard drives (the cheapest per byte, but with moving parts that wear out and need more frequent replacement), while longer-lasting solid-state drives (SSDs), including NVMe (Non-Volatile Memory Express) devices, are used for roles like caching and databases, where speed and performance matter most.

Barry showing us all the bins of hardware they use to maintain the servers.

Barry explained that data at Automattic is stored in multiple places in the same data center, and redundantly again at several other data centers. Even with that much redundancy, a further copy is stored in an outside backup. Each of the centers Automattic uses is also deliberately separated from the others, so it is difficult for a single bug to propagate between facilities. In the last decade, there has been only one instance where the outside backup had to come into play, and it was for six images. Still, Barry noted that there can never be too many backups.
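The article does not describe Automattic's actual replication machinery, but the policy Barry outlines (several copies in the home data center, more copies at other data centers) can be sketched as a small placement function. Everything here, including the data center names and copy counts, is hypothetical:

```python
import zlib
from collections import defaultdict

def place_replicas(obj_id, local_dc, all_dcs, local_copies=2, remote_copies=2):
    """Plan where copies of an object go: several in the home data center,
    plus copies in distinct other data centers (a hypothetical policy)."""
    plan = defaultdict(list)
    for i in range(local_copies):
        plan[local_dc].append(f"{obj_id}#local{i}")
    # Choose remote sites with a stable hash so the same object id
    # always maps to the same set of data centers.
    remotes = [dc for dc in sorted(all_dcs) if dc != local_dc]
    start = zlib.crc32(obj_id.encode()) % len(remotes)
    for i in range(remote_copies):
        plan[remotes[(start + i) % len(remotes)]].append(f"{obj_id}#remote{i}")
    return dict(plan)

plan = place_replicas("img-12345", "dc-us-east",
                      ["dc-us-east", "dc-us-west", "dc-eu", "dc-apac"])
print(plan)
```

A real system would also track replica health and re-replicate on failure; the point here is only that copies land in distinct facilities, so no single-site event can take out all of them.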

And with that, we concluded the tour and I would soon head off to the airport to fly home. The last question Barry asked me was if I thought this would all be around in 100 years. My answer was that something like it most certainly will, but that it would look radically different, and may be situated in parts of the world with more sustainable cooling and energy, as more of the world gets large bandwidth connections.

As I thought about the project of getting all this data to last into the deep future, I was very impressed by what Automattic has built, and believe that as long as business continues as normal, the data is incredibly safe. However, on the chance that things do change, I think developing partnerships with organizations like The Internet Archive, Permanent.org, and perhaps national libraries or large universities will be critically important to help make sure the content of the open web survives well into the future. We could also look at some of the long-term storage systems that store data without the need for power, as well as systems that cannot be changed in the future (as we wonder if AI and censorship may alter what we know to be “facts”). For this, we could look at stable optical systems like Piql, Project Silica, and Stampertech. It breaks my heart to think the world would have created all this, only for it to be lost. I think we owe it to the future to make sure as much of it as possible has a path to survive.

Our group of Automatticians enjoyed the tour—thank you Barry and Eugene!

Missing out on the latest WordPress.com developments? Enter your email below to receive future announcements direct to your inbox. An email confirmation will be sent before you start receiving notifications; please check your spam folder if you don't receive it.

WP Cloud Is Powering the Future of WordPress

Democraten NU Almelo 07-03-2024 15:00

The foundational infrastructure for the websites you build and manage is crucial for ensuring a safe, secure, fast, and reliable environment. That’s where WP Cloud comes in.

Automattic, the parent company of WordPress.com, built WP Cloud because we wanted a cloud platform constructed from the ground up just for WordPress. We’ve hosted millions of websites across the WordPress ecosystem and have become one of the most trusted providers in cloud services.

We’re proud of WP Cloud’s 99.999% uptime, automated burst scaling and failure detection, and failover redundancies that allow you to spend time focusing on building your business or serving your clients instead of worrying about whether a traffic spike will crash the site.

WP Cloud is also incredibly secure. With DDoS protection, malware scanning, anti-spam measures, SSL certificates, TLS traffic encryption, and real-time backups, you’ll have peace of mind from day one.

We’re confident that there’s no better cloud platform for your WordPress site(s) than WP Cloud. And we’re not the only ones to think so.

Today, WP Cloud is announcing that Bluehost—one of the largest website hosts in the world—is launching a new product built atop WP Cloud’s best-in-class infrastructure.

Bluehost Cloud includes all the technical excellence of WP Cloud, with bundled options for hosting multiple websites. Plus, as with all of the sites on WordPress.com, it comes with Jetpack’s highly acclaimed performance and security features built right in.

To kick off this partnership, we’re showcasing Bluehost Cloud on WordPress.com’s pricing page, so that you can choose the product that best fits your business needs. As fellow supporters of the WordPress ecosystem, we’re glad Bluehost has chosen WP Cloud for this powerful new offering.


Screen happiness, or maybe not…

Gemeentebelang Nunspeet 24-02-2024 05:00

You surely know the sight: young people, but older people too, peering at a little screen, completely unaware of what is happening in the world around them.

I am sometimes guilty of it myself, and at the same time I worry about this development. Sometimes it seems as if all communication happens through devices, little screens, and digital systems, and a normal conversation seems to be experienced as a chore. Digitization and its effects on the well-being of children and young people are both positive and negative.

Digitization is changing our society at a rapid pace. Steering it from public government is not easy. You would want the information society to develop in a socially responsible way, but how that should happen is a complicated question, and perhaps still an elusive one.

Digital technology connects the world ever more closely, but it also makes the world more complex. Because developments come at a rapid pace, it is sometimes difficult to keep an overview and to steer them in the right direction.

For the societal issues and challenges we face today, such as the energy transition, the housing challenge, and the fight against poverty, we do need digital technology. But how do we keep the use of AI safe? Can we hold back screen addiction among young (and older) people? And how, seen from the Gemeentebelang perspective, does automated service provision become part of the municipal organization without losing the human dimension? It is an enormous challenge that we are dealing with now and in the years to come, and a question for which no ready-made answer exists across the whole breadth of society.

My personal wish: on a personal level, let us enjoy the good things the digital world can bring us, but let us try to leave our little screens in our back pockets or bags a bit more often. Strike up a conversation again with the person next to you, young and/or old, in the doctor's waiting room, on the bus or the train, and along the pitch on Saturday. Continuing to seek connection with one another makes the world, however small, a little more colorful.

And when we are online after all, we are all responsible together for keeping things pleasant in that digital world.

Daniëla Koster

Arjen Vroegop in the Uitkijkpost

VVD Heiloo 15-02-2024 02:22

On social media I sometimes read very strong statements and opinions. When I care about something, I also commit myself to it. That is why I have been politically active from a young age. Liberal, because I believe in people and value freedom, responsibility, and social justice. And in the council, because there is so much good work to be done in Heiloo.

3.    What do you want to achieve for Heiloo in the coming period?

More opportunities for sport and culture, a sound approach to traffic bottlenecks, climate measures, the Stationsgebied, and the Looplein. Staying in touch with residents, organizations, and entrepreneurs about current issues such as traffic in Heiloo, public order and safety, and the water problems in Zuiderloo. Alongside, of course, the A9 exit and building in Zandzoom and other locations. And throughout: keeping an eye on reality. Let the idealistic grand visions drift past. After all, they were doing that already.

4.    What do you do besides your council membership?

I work at the Toeslagen directorate, where through good service we want to give everyone the allowance they are entitled to. This is to prevent people from having to pay money back. I also help people and organizations as a coach and adviser, giving direction to questions that matter to them about professional development, legal issues, and organizational change.

5. Have you already experienced a special moment in the council?

The opening of the new town hall was a fine moment. As a municipality we are not going to sit on an industrial estate, reachable only online; we make sure we are visible in the middle of the village, and in doing so we also offer the library, the VIP, and the social café a future.

6. What else would you like to share with the residents of Heiloo?

Stay a little kind to one another, because we need each other. And if something is on your mind, you are welcome to think along with us about how we can keep offering Heiloo a fine liberal future. See https://heiloo.vvd.nl/ or email us at vvdheiloo@outlook.com. We are happy to talk with you. You can also call.


20 April 2024: Haya Training on Media | Interviews

VVD Heumen 01-02-2024 08:05

On 20 April we are organizing a training day built around a media training session. The training is part of the curriculum of the Haya van Somerenstichting and is led by Eric Trinthamer. Practicing yourself and learning from and with others: that is what it is all about.

The training is intended first and foremost for politically active members (council members, fractievolgers). If you are not yet politically active but are interested in the training, do get in touch to discuss the possibilities.

The training day starts at around 10:00 in the Nijmegen area (the location will follow). A contribution toward the costs will be asked, but in return you get a great training! You can register via mijnvvd or by emailing our board member Max Knegjens (mknegjens@me.com).

Bringing You a Faster, More Secure Web: HTTP/3 Is Now Enabled for All Automattic Services

Democraten NU Almelo 31-01-2024 18:12

HTTP/3 is the third major version of the Hypertext Transfer Protocol used to exchange information on the web. It is built on top of a new protocol called QUIC, which is set to fix some limitations of the previous protocol versions. Without getting into technical details—though feel free to do so in the comments if you have questions—our users should see performance improvements across all metrics:

Reduced latency. Connection establishment takes fewer round-trips, so latency from connection setup is lower.

Multiplexing. That is, using a single connection for multiple resources. While this feature is present in HTTP/2, HTTP/3 has improved on it and fixed a problem called “head-of-line blocking.” This is a deficiency of TCP, the underlying protocol HTTP/2 was built on top of, which requires packets to arrive in order before relaying them for processing.

Reliability. Designed to perform better in varying network environments, HTTP/3 uses modern algorithms to help it recover faster from lost data and busy networks.

Improved security. QUIC uses the latest cryptography protocols (TLSv1.3) to encrypt and secure data. More of the data is encrypted, which makes it harder for an attacker to tamper with or listen in on web requests.

Ultimately, HTTP/3 (on top of QUIC) has been designed to be updated in software, which allows for quicker improvements that don’t depend on underlying network infrastructure.
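One visible artifact of an HTTP/3 rollout is the Alt-Svc response header: a server answering over HTTP/1.1 or HTTP/2 uses it to advertise that HTTP/3 ("h3") is available, so the client can switch to QUIC on subsequent requests. The header value below is a made-up example, not one captured from Automattic's servers:

```python
def advertises_h3(alt_svc: str) -> bool:
    """Return True if an Alt-Svc header value advertises HTTP/3 ('h3')."""
    for entry in alt_svc.split(","):
        # Each comma-separated entry looks like: protocol=":port"; params
        protocol = entry.strip().split("=", 1)[0].strip()
        if protocol == "h3":
            return True
    return False

# A typical Alt-Svc value (made up for illustration):
header = 'h3=":443"; ma=86400, h2=":443"; ma=86400'
print(advertises_h3(header))  # True
```

You can inspect a live site's header yourself with `curl -sI <url> | grep -i alt-svc`.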

After about a month of preparing our infrastructure (including fixing bugs and upgrading our CDN), HTTP/3 was enabled for all of Automattic's services on December 27th, 2023. It currently serves roughly 25-35% of all traffic.

And now for some stats. For each of these, we want numbers to be lower after the switch, which ultimately means faster speeds across the board for our customers. Let’s look at three metrics in particular:

Time to First Byte (TTFB) measures the time between the request for a resource and when the first byte of a response arrives.

Largest Contentful Paint (LCP) represents how quickly the main content of a web page is loaded.

Last Resource End (LRE) measures the time between the request for a resource and when the whole response has arrived.

Results for fast connections—low latency and high bandwidth

Improvements look pretty good for fast connections:

TTFB: 7.3%

LCP: 20.9%

LRE: 24.4%

Results for slow connections—high latency or low bandwidth

For slow connections, the results are even better:

TTFB: 27.4%

LCP: 32.5%

LRE: 35%
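To make those percentages concrete: an X% improvement means the new timing is (1 - X/100) of the old one. The 800 ms baseline below is hypothetical, since only relative improvements are reported here, not absolute times:

```python
def apply_improvement(before_ms: float, pct: float) -> float:
    """Timing after a pct% improvement (smaller is better)."""
    return before_ms * (1 - pct / 100)

# Hypothetical 800 ms TTFB on a slow connection, improved by 27.4%
print(f"{apply_improvement(800, 27.4):.0f} ms")  # 581 ms
```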

We are dedicated to providing our customers' websites with the best possible performance. Enabling HTTP/3 is a step in that direction. See you on the QUIC side!

Automattic’s mission is to democratize publishing. To accomplish that, we’re hiring systems engineers to join the best infrastructure team on the planet. Learn more here.


VVD Teylingen organizes a High Beer to kick off 2024!

VVD Teylingen 02-01-2024 04:28

Together we will kick off 2024 while enjoying three (0.0) craft beers (or soft drinks/wine) and various snacks! Everyone who would like to join is welcome, not just our members. So be sure to bring your friends and family!

https://teylingen.vvd.nl/nieuws/54612/vvd-teylingen-organiseert-high-beer-om-2024-af-te-trappen

21 January at 14:00

Eetcafe de Voogd in Sassenheim

Hoofdstraat 267B

2171 BD, Sassenheim

The High Beer costs 27.50 euros per person. You can register via the following link:

https://docs.google.com/forms/d/e/1FAIpQLSeoH4elZiDeP-vSTXLIX44wtMP1_TpUiYcvdQf7cxWT-jVBaA/viewform


Winter Walk

Open-Groen-Progressief Midden-Delfland (OGP) Midden-Delfland 23-12-2023 15:33

If you have questions, don't hesitate: contact us directly. You can easily reach us via the contact page of this site. But you can also call, email, or approach one of the council members directly. Take a look at the party group page. We will be happy to answer your questions!

Christmas and New Year's Greeting

Open-Groen-Progressief Midden-Delfland (OGP) Midden-Delfland 23-12-2023 15:31

If you have questions, don't hesitate: contact us directly. You can easily reach us via the contact page of this site. But you can also call, email, or approach one of the council members directly. Take a look at the party group page. We will be happy to answer your questions!
