Tigers improve to 5-3 in Valley

By Paul Lecker
Sports Reporter

MARSHFIELD — With matches looming against the top two teams in the Wisconsin Valley Conference on Saturday, the Marshfield volleyball team wanted to make sure it took care of the business at hand Thursday.

The Tigers started all three games strong and went on to beat Wisconsin Rapids 25-14, 25-11, 25-10 in a WVC match Oct. 6 at Marshfield High School.

Marshfield is now 5-3 heading into the second conference tournament on Saturday at D.C. Everest. The Tigers will play first-place Everest and second-place Merrill as they look to make a push up the standings with only one week remaining in the regular season.

Marshfield senior Stephanie Rhodes had eight kills in the victory, pushing her career total to 1,001, a milestone that coach Dawn Sadowska said was well-earned.

“She’s been on varsity since she was a freshman, and year after year she’s been getting better,” Sadowska said. “We’ve asked more and more of her. She’s such a versatile hitter, and this year her blocking has been better than it ever has been for us.”

Anna Ripp, who passed the 2,000-assist mark in her career last week, started the first game by serving six points in a row — three were aces — to push Marshfield out to a quick 7-1 lead.

The Tigers finished just as strong, scoring the final eight points of the set, the final point coming on a Jamila Ougayour kill.

Ripp had two aces to help Marshfield build a 5-0 lead to start Game 2, and she served eight straight points to begin the third set as the Tigers ran out to a 9-0 advantage. Marshfield was never challenged in either game.

Ripp finished with 28 assists and six service aces, Alexa Aumann had a team-high 11 kills, and Maureen Cassidy had 15 digs to top Marshfield. The Tigers finished with 11 service aces overall.

“Our plan was to serve aggressive, and we’ve been working on trying to run things faster,” Sadowska said.
“I said every free ball, ‘We have to go quick because that’s our plan for this weekend,’ and they did a really good job of that.”

(Hub City Times Sports Reporter Paul Lecker is also the publisher of MarshfieldAreaSports.com.)
12 May 2008

After seven wins on South African soil, Hennie Otto finally broke his European Tour duck by scoring an outstanding victory in the Italian Open at the Castello di Tolcinasco Golf and Country Club in Milan on Sunday.

Towards the end of March, Otto had narrowly missed out on his first European Tour win; he took a five-shot lead into the final round of the Madeira Islands Open, but ended up in a playoff against Alastair Forsyth, which the Scot won.

Stunning form

On Sunday, the circumstances were similar to six weeks earlier in Portugal; Otto enjoyed a four-shot lead heading into the final round.

He had been superb through the first three rounds; after opening with a seven-under-par 65, he turned in a strong 66, followed by a scintillating nine-under 63 in the third round for a stunning three-round total of 22-under-par 194. That left him only one shot behind the European Tour record for three rounds, jointly held by South Africa’s Ernie Els and David Howell of England.

Given his form in the first three rounds, it was a solid bet that Otto would take victory. It duly came, but it didn’t come easily.

Early on, it looked as if the South African would cruise to victory after he increased his lead to five shots by sinking birdies on three of the first five holes. England’s Oliver Wilson was on the charge, however, and put Otto under pressure.

Wilson on the charge

After his opening birdie blitz, Otto played six holes to par, but then dropped a shot on the twelfth. Wilson, meanwhile, was four-under through the front nine and then nailed four birdies in succession from the eleventh to the fourteenth.

The Englishman settled for par on the last four holes to finish with a fine eight-under-par 64 and a total of 264.

After dropping a shot at the twelfth, Otto immediately pulled one back with a birdie on the thirteenth.
In much the same manner as Wilson, he played to par the rest of the way to complete his round in three-under-par 69 for a 25-under total of 263.

His total was the lowest in the Italian Open since it was added to the European Tour, bettering the 265 recorded by Francesco Molinari in 2006. It was also the lowest winning total so far on the Tour this season.

Useful experience

Afterwards, reflecting on his win, the 31-year-old from Boksburg said his disappointment at the Madeira Islands Open helped him stay focused when he saw his lead slipping away. In the end, he played a superb drive onto the green at the final hole, with one foot in a bunker, to ensure he captured his maiden title in Europe.

He felt his putting had let him down in Portugal. This time around, he said, his putting, especially from distance, was what won him the tournament.

Otto’s winnings totalled €283 330, which when converted to rands is in excess of R3.33-million. As important as the winning purse, his victory earned him an exemption on the European Tour until the end of 2010; it is a bonus that is hard to put a price on.

World ranking

Otto entered the Italian Open ranked 154th in the world. After his win he is now ranked 76th, a massive rise of 78 places. His is one of nine South Africans ranked within the top 100.

Ernie Els tops the list, in third, followed by Rory Sabbatini, in fourteenth, and US Masters champion Trevor Immelman in sixteenth. Retief Goosen is in 30th place, Richard Sterne is ranked 39th, and Tim Clark 45th. Louis Oosthuizen follows, in 74th spot, with Otto two places behind him, and Charl Schwartzel in 85th place.

Italian Open leaderboard

Hennie Otto (RSA) 263 (-25)
Oliver Wilson (Eng) 264 (-24)
Robert Karlsson (Swe) 265 (-23)
Philip Archer (Eng) 267 (-21)
Marcel Siem (Ger) 267 (-21)

South Africans

David Frost (RSA) 276 (-12)
Charl Schwartzel (RSA) 280 (-8)
Phase Two of Nedbank’s head office is a certified green building. The Rukinga community has benefited from Nedbank’s purchase of carbon credits. (Images: Nedbank)

Janine Erasmus

South African banking group Nedbank announced in July 2010 that it has achieved carbon neutrality – becoming the first bank in Africa to do so.

The group’s chief executive Mike Brown made the announcement at the Nedbank head office in Sandton, in Johannesburg’s northern suburbs.

“This achievement sees us following through on the commitment we made in September 2009 to completely offset our carbon footprint,” said Brown. “As our group executive of enterprise, governance and compliance, Selby Baqwa, likes to say: there is no Planet B.”

Nedbank is now green not only in corporate colour, but also in deed. Brown said that it made sound business sense to go green, but was also very much the right thing to do.

“As a large business we have to strike a balance between economic, social, cultural and environmental issues,” he said. “An overt focus on one area can be successful only in the short term. In order to achieve sustainable returns, all of Nedbank’s interdependent parts must be in balance.”

The mammoth achievement involved not only the group head office, but all the regional buildings as well as some 500 branches and kiosks across the country. It also took the support of 90% of the staff, who were equipped with the knowledge and tools to make a difference at the office, and then to apply the concept in their own homes.

Phase Two of Nedbank’s head office, now complete, is the first four-star green-rated commercial building in the country.
It’s been certified by the Green Building Council of South Africa and operates 30% more efficiently than a normal building.

Positive impact on communities

The Nedbank group made the commitment to go green as many as 25 years ago, determined to make a positive impact on the communities it serves and set an example for its customers and suppliers in the process.

Brown believes that the bank’s integrated approach was the key to achieving a zero carbon footprint.

“We didn’t achieve it merely by going out and buying up carbon credits, but by a genuine and substantial reduction in our carbon footprint. We started by measuring our impact on the environment – this process was initiated back in 2007 – and now we have a robust measuring system in place.”

To ensure integrity in the measuring process and set up the required framework, the bank’s carbon neutral task team worked closely with the local chapter of the World Wide Fund for Nature (WWF), Brown said.

“The support from the bank’s leadership is particularly encouraging,” said Richard Worthington of the WWF, “and it has contributed to the bank’s success in its green endeavours. Business and industry in South Africa is generally not doing well in this sphere, and we appreciate Nedbank’s role in setting the pace for others to follow.”

Once an accurate measurement had been recorded and then verified by Ernst & Young, a target was set in terms of behaviour change, more efficient operation, and cutting down on paper, electricity, water, travel and waste.

“Once we had achieved those reductions that we’d committed to, we offset the remaining unavoidable greenhouse gas emissions, around 213 000 tons, with carbon credits.”

After much research, the bank settled on the sustainable Rukinga project in Kenya to obtain its carbon credits. Rukinga is Africa’s first large-scale gold level-accredited Reduced Emissions from Deforestation and Degradation initiative.

“We took great care in choosing our beneficiary,” said Brown.
“We analysed projects involving biodiversity, renewable energy, and the reduction of deforestation. Unfortunately we couldn’t find a suitable project in South Africa, but other projects have sprung up in the meantime and in the future we’ll be able to secure a more diverse portfolio here at home and elsewhere in Africa.”

The Rukinga initiative, situated on 75 000 acres of land in the Kasigau Corridor, focuses on reducing deforestation and degradation of wildlife habitats, eliminating about 3.5-million tons of carbon dioxide emissions, and providing residents with a conservation-related income.

The community has benefited richly from its Climate Community and Biodiversity Alliance-approved programme, receiving 18 new classrooms, a boost to its citrus industry, and better healthcare. A large amount of land was reclaimed for wildlife and numerous jobs were created.

A journey of many steps

Nedbank’s achievement of carbon neutrality took many steps, both big and small. It involved the extensive use of compact fluorescent lamps and motion-controlled lighting, turning off appliances when not in use, encouraging employees to use the stairs and not the lifts, using biodegradable cleaning materials, and installing water filters on taps instead of supplying bottled water.

In addition, the bank only uses environment-friendly paper and recycles as much as it can. In 2009 it recycled 124 tons of paper and 34 tons of other waste.

“The electricity we saved during 2009 could power 317 houses,” said Brown.

Video- and teleconferencing helped to cut down on travel costs, resulting in a 30% reduction in the number of flights taken by Nedbank employees during 2009 and a 34% reduction in car rental.

“We are also flexible with working hours and don’t mind if our people want to work from home from time to time – this has saved about 3.5-million kilometres of travel.”

In total, said Brown, the bank saved R28-million (US$3.7-million) during 2009.
This figure contrasts favourably with the R2-million ($264 398) spent on putting the measuring and monitoring system into place, and the R12-million ($1.6-million) spent on carbon credits.

“This is not the end,” said Brown. “Rather, it marks the beginning of an ongoing process in greenhouse gas reduction. We aim to address the issue of water conservation in South Africa, as this is a key sustainability issue for us.”

Nedbank is also already working on a number of innovative green retail products for its clients. The bank intends to use its leadership position to encourage all corporates in South Africa to operate in a responsible manner and play their part in mitigating climate change.
South African financial services company Ubank is distributing 3 000 pairs of school shoes to learners from seven schools in the Eastern Cape, Free State, Limpopo and North West provinces as part of its ‘Back to School’ campaign, with the latest recipients being Matlhaleng Secondary School and Kgorathuto High School.

Matlhaleng Secondary School is situated in the township of Kanana, outside Klerksdorp in the North West province, and Kgorathuto High School is situated in the township of Botshabelo, outside Bloemfontein in the Free State province.

“One of the pillars supporting Ubank’s business strategy is community development. Investing in the communities in which we operate forms a critical part of how we do business, and our constant effort has been to help uplift the lives not only of our customers, but also their families,” says Ubank chief executive officer Luthando Vutula.

Ubank has prioritised sustainable educational programmes in partnership with various district offices of the Department of Education (Image: Ubank)

According to Vutula, Ubank has prioritised sustainable educational programmes in partnership with various district offices of the Department of Education, and is also assisting schools in improving their facilities.

“Every child deserves not just books and good facilities to study, but also proper uniforms and clothing that is protective,” said Vutula.
“We focused our efforts on helping out the learners of Matlhaleng Secondary School because this is one of the most challenged schools in the area.

“We are extremely pleased that through our efforts the school is able to overcome at least one of the challenges to ensure an improved learning experience.”

The ongoing campaign with North WestFM to raise more shoes has to date collected a further 2 000 pairs.

ADOPT A SCHOOL

As part of its ongoing corporate social investment initiatives, Ubank adopted Kgorathuto High School in September 2013, when the bank’s Bloemfontein branch was opened.

Thereafter Ubank assisted the school in overcoming its most pressing issues, including repairing the ceilings in 10 classrooms, maintaining ablution facilities for learners, painting three classrooms and providing school uniforms for 100 learners at the school.

“Ubank has always believed in maintaining the relationships that were started, and this was the reason why the team focused on Kgorathuto High School as one of the schools that benefited from our ‘Back to School’ campaign,” said Ubank chief operations officer Bungane Radebe. “Shoes were one of the only elements that we were not able to help the learners with last year, and now we have fulfilled this need as well.”
No, that’s not a typo in the headline. You’ve heard of virtual sprawl already, no doubt, and you may have experienced it in your own company. But virtual stall, or VM stall, refers to a concept coined by CA Technologies in a landmark white paper published in August 2010.

Essentially, the idea is this: many enterprises have launched their boats already and are heading toward a virtualized data center or a reputable hybrid cloud solution. The problem is, they’re not halfway there yet, and the reasons are often more political than technological.

The broader topic of virtualizing large business workloads is the subject of a panel I’ll be moderating today at 1:00 pm ET / 10:00 am PT, with VMware Senior Systems Engineer Stephen Shultz and Intel Mission-Critical Data Center Strategist Mitchell Shults. (You read right, Shultz and Shults.) It’s a live chat, and I’d like to have you join in, especially with respect to this very topic: how to realign the organization around retooled, virtualized resources.

Yesterday, I talked about the problem of departmental silos that fail to align with the resource pools and shared servers that are the by-products of the virtual data center. Whereas a few years ago certain managers were in charge of particular boxes, today those boxes don’t physically exist. But their jobs do.

“Virtual stall” is a bigger problem, which I discussed in greater detail in a story earlier in the summer.
It’s more of a political/technological mashup, or what we used to call in the 20th century “sociology,” where issues that crop up in the business of migrating or transitioning systems lead to breakdowns in communication, which lead to the inevitable problem of whom to blame.

Here’s an excerpt from the CA white paper, by Mason Bradbury and Andi Mann:

“Even some IT departments are skeptical of moving key systems to virtual environments for fear of the ‘guilty until proven innocent’ phenomenon – in which the virtualization infrastructure is blamed for problems until it is definitively shown otherwise. As resources are delivered to applications in pools, application specialists and others outside of the systems management department have far less visibility than in a traditional physical environment into the systems on which their applications run. When problems arise, employees are far more likely to blame the virtual infrastructure, as they cannot troubleshoot it themselves. Even if the problem has to do with the operating system or application itself, virtualization administrators worry that application owners will blame them until they can locate the problem and show that it is not a fault in the virtual infrastructure.”

Since that white paper was published, other firms, including Symantec, have followed up with new research. For example, take the case of the admin who happens to be the script-writing expert in the IT shop. This is the go-to person for ad hoc issues that crop up, who eventually becomes credited with finding the whirlwind solutions to every virtualization transition issue that other admins fail to address. If the need for those solutions were to, say, go away, then the level of appreciation this person receives would also subside.
As the Symantec report published last June relates, some IT shops get hung up on never-ending remedial projects that require one-time script writing — for example, “root cause analysis.” There may or may not be actual analysis involved, but the process becomes self-sustaining, taking on a life of its own. And Symantec has started measuring the side effects in terms of dollars.

As Symantec’s Jennifer Ellard put it, “The admins simply delay virtualization which costs them increased operational complexity and inability to consolidate physical servers and end up having to pay for the energy and space utilization.”

So here’s the problem: in environments where virtualization deployment takes place in stages, the workloads that businesses would most readily classify as mission critical – the big databases, the e-mail servers, messaging, collaboration, ERP – get pushed to the bottom of the agenda. And there they sit, waiting for the political issues to resolve themselves further up the ladder.

We’ll be discussing the entire topic of mission-critical workload virtualization this afternoon in the RWW live chat. I hope to see you there.

scott fulton | Tags: #cloud #Virtualization
Many business and IT leaders are focused on developing comprehensive data strategies that enable data-driven decision making. A 2016 IDG Enterprise survey found that 53% of companies were implementing or planning to implement data-driven projects within the next 12 months, specifically projects undertaken with the goal of generating greater value from existing data.[1] With the growing importance of AI and advanced analytics today, it seems a safe assumption that this number has only increased over time.

The concept of building a data strategy is such a hot topic that top-tier universities are creating executive-level courses on the subject,[2] while industry observers are predicting that by 2020, 90% of Fortune 500 companies will have a chief data officer (CDO) or equivalent position.[3]

Yet despite all of this momentum, the concept of a data strategy remains new to many organizations. They haven’t thought about it in the past, so it is uncharted territory, or maybe even an unknown-unknown. With that thought in mind, in this post I will walk through some key considerations for building a robust data strategy.

Why is a robust data strategy important? A data strategy is a business-driven initiative in which technology plays an important supporting role. No matter what, you always start with a set of business objectives, and having the right data when you need it results in business advantages.

The Big Picture

A well-thought-out data strategy will have components specific to one’s own organization and application area. There are, however, important commonalities to any approach. Some of the more important ones include methods for data acquisition, data persistence, feature identification and extraction, analytics, and visualization, three of which I will discuss here.

When I give talks about the data science solutions my team develops, I often reference a diagram describing how many data scientists organize the information flow through their experiments.
A good data strategy needs to be informed by these concepts; your choices will either facilitate or hinder how your analysts are able to extract insights from your data!

Figure 1. The standard data science workflow for experimental model creation and production solution deployment. EDA: Exploratory Data Analysis.

Data Acquisition and Persistence

Before outlining a data strategy, one needs to enumerate all the sources of data that will be important to the organization. In some businesses, these could be real-time transactions, while in others they could be free-text user feedback or log files from climate control systems. While there are countless potential sources of data, the important point is to identify, at the outset, all of the data that will play into the organization’s strategy. The goal is to avoid time-consuming additional steps further along in the process.

In one project I worked on when I was but a wee data scientist, we needed to obtain free-text data from scientific publications and merge the documents with metadata extracted from a second source. The data extraction process was reasonably time-consuming, so we had to do it as a batch operation and store the data to disk. After we completed the process of merging our data sources, I realized I had forgotten to include a data source we were going to need for annotating some of the scientific concepts in our document corpus. Because we then had to do a separate merge step, our experimental workflow took a great deal more time, necessitating many avoidable late hours at the office. The big lesson here: proactively thinking through all the data that will be important to your organization is a guaranteed way to save some headaches down the road.

Once you have thought through data acquisition, it’s easier to make decisions about how (or if) these data will persist and be shared over time. To this end, there have never been more options for how one might want to keep data around.
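The merge pitfall in that anecdote is easy to sketch. Below is a toy, pure-Python version of such a batch join step; the sources, field names, and records are all invented for illustration and do not reflect the actual project:

```python
# Toy sketch of a batch merge between free-text documents and a second
# metadata source, keyed on a shared identifier. All field names and
# records here are hypothetical.

documents = [
    {"doc_id": "pub-001", "text": "CRISPR screening of tumour cells ..."},
    {"doc_id": "pub-002", "text": "Deep learning for protein folding ..."},
]

metadata = [
    {"doc_id": "pub-001", "journal": "Journal A", "year": 2015},
    {"doc_id": "pub-002", "journal": "Journal B", "year": 2016},
]

def merge_sources(docs, meta, key="doc_id"):
    """Join two batch-extracted sources on a shared key.

    Enumerating every needed source before this step runs means the
    (slow) join happens once; discovering a forgotten source later
    forces a second, avoidable merge pass over the whole corpus.
    """
    meta_by_key = {record[key]: record for record in meta}
    return [{**doc, **meta_by_key.get(doc[key], {})} for doc in docs]

corpus = merge_sources(documents, metadata)
print(corpus[0]["journal"])  # -> Journal A
```

A forgotten annotation source here means re-running the entire batch join, which is exactly the time sink described above.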
Your persistence choices should be informed by a few factors, including the data types in question, the speed at which new data points arrive (e.g., is it a static data set or real-time transactional data?), whether your storage needs to be optimized for reading or writing data, and which internal groups are likely to need access. In all likelihood, your organization’s solution will involve a combination of several data persistence options.

Your choices are also likely to change in big versus small data situations. How do you know if you have big data? If it won’t fit in a standard-size grocery bag, you may have big data. In all seriousness, though, my rule of thumb is that once infrastructure (i.e., the grocery bag) is a central part of your data persistence solution, you are effectively dealing with big data. There are many resources that outline the advantages and disadvantages of the available options. These days, many downstream feature extraction and analytical methods have libraries for transacting with the more popular choices, so it’s best to base one’s decision on expected data types, optimizations, and data volume.

Feature Identification and Extraction

In data science, a “feature” is the information a machine learning algorithm uses during the training stage of a predictive model, as well as what it uses to make a prediction for a previously unseen data point. In the case of text classification, features could be the individual words in a document; in financial analytics, a feature might be the price of a stock on a particular day.

Most data strategies would do well to steer away from micromanaging how analysts approach this step of their work. However, there are organization-level decisions that can be made to facilitate efficiency and creativity here. The most important, in my mind, is fostering an environment that encourages developers to draw from, and contribute to, the open source community.
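For the text-classification case just mentioned, the words-as-features idea can be sketched in a few lines of plain Python. The documents below are invented examples, and a production pipeline would more likely reach for an open source implementation such as scikit-learn's CountVectorizer:

```python
# Bag-of-words feature extraction: each document becomes a mapping
# from word to occurrence count, which a classifier can train on.
# The documents here are invented examples.
from collections import Counter

docs = [
    "great product and fast shipping",
    "terrible product never again",
]

def bag_of_words(text):
    """Lowercase, split on whitespace, and count word occurrences."""
    return Counter(text.lower().split())

features = [bag_of_words(doc) for doc in docs]
print(features[0]["product"])  # -> 1
```

Each `Counter` is a sparse feature vector: words absent from a document simply count as zero, which is how most open source vectorizers represent text features internally as well.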
Encouraging that engagement is essential. Many of the most effective and common methods for feature extraction and data processing are well understood, and excellent implementations exist in the open source community (e.g., in Python*, R*, or Spark*). In many situations, analysts will get the most mileage out of trying one of these methods. In a research setting, they may be able to try custom methods that are effective in a particular application domain. It will benefit both employee morale and your organization’s reputation if they are encouraged to contribute these discoveries back to the open source community.

Predictive Analytics

Again, I think it’s key for an organization-level data strategy to avoid micromanaging the algorithm choices analysts make in performing predictive analytics, but I would still argue that there are analytical considerations that belong in a robust data strategy. Data governance (the management of the availability, usability, integrity, and security of your organization’s data) is a central part of the CDO’s role, and analytics is where a lot of this can break down or reveal holes in your strategy. Even if your strategy leverages NoSQL databases, if the relationships between data points are poorly understood or not documented, it’s possible that analysts could miss important connections, or even be prevented from accessing certain data altogether.

Overarching Considerations

To take a step back, a data strategy should include identification of the software tools your organization will rely upon. Intel can help here.
Intel has led or contributed actively to the development of a wide range of platforms, libraries, and programming languages that provide ready-to-use resources for data analytics initiatives.

To help with analytical steps and some aspects of feature identification and extraction, you can leverage the Intel® Math Kernel Library (Intel® MKL), the Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN), and the Intel® Data Analytics Acceleration Library (Intel® DAAL), as well as BigDL and the Intel® Distribution for Python*.

Intel® MKL arms you with highly optimized, threaded, and vectorized functions to increase performance on Intel processors.

Intel® MKL-DNN provides performance enhancements for accelerating deep learning frameworks on Intel architecture.

Intel® DAAL delivers highly tuned functions for deep learning, classical machine learning, and data analytics performance.

BigDL simplifies the development of deep learning applications for use as standard Spark programs.

The Intel® Distribution for Python accelerates Python application performance on Intel platforms.

Ready for a deeper dive? Our “Tame the Data Deluge” whitepaper is a great place to get started. For some real-life examples of the way organizations are using data science to make better decisions in less time, visit the Intel Advanced Analytics site.

[1] IDG Enterprise Data and Analytics Survey 2016.
[2] For an example, see Data Strategy for Business Leaders, an educational offering from the Haas School of Business at the University of California, Berkeley.
[3] DATAVERSITY, “2017 Trends in Data Strategy,” December 13, 2016.
Khwaabb

Cast: Navdip Singh, Simer Motiani, Bajrangbali Singh, Rishi Miglani, Nafisa Ali
Director: Zaid Ali Khan
Rating:

A poster of the movie Khwaabb.

The film promises to tell the story of an athlete’s dreams, struggles, and determination, but after seeing it you can’t help but feel a little disappointed, mostly because of the film’s weak storyline. There’s hardly anything worth mentioning here.

Sanjay and Kiran belong to the same village, and Sanjay has developed quite a liking for Kiran. While Kiran is an ace swimmer, Sanjay does nothing and knows no better; all he can do is run pretty fast. So far he’s been living a life dealing with his alcoholic father’s beatings. And no sooner than you can spell your own name, the head of a sports academy, Ram Prasad Lakshman, spots the talented two and brings them back to his academy.

Kiran makes the most of this opportunity and works hard on her talent, while Sanjay only has eyes for her and not for the track or field. He doesn’t even approve of Kiran’s friendship with a boy called Samir. Soon enough, Kiran and Sanjay end up dealing with issues at the academy that make them realise that their dream is the only thing they’ve got. But they’ve yet to deal with something as serious as a dope test, and whether they’ll be able to clear it and realise their dreams is exactly what the film tries to focus on.

The film fails at dealing with the small details that make the big picture. There’s nothing on the charm of sports, nor any in-depth look at the lives of athletes. Everything looks a bit superficial.

The acting, too, isn’t compelling enough, and hardly does much to compensate for the lack of a good story. All in all, the movie fails to impress and barely even manages to make it to the ‘average’ slot.