Tag: Incredible

  • Before you click on that incredible deal…


    Scammers are everywhere on the internet, masquerading as trusted sources to obtain your personal information. Many social media users and website creators pose as government entities or other authorities, offering things that seem too good to be true or using scare tactics, like fake warnings about late fines or missed court dates, to prompt online users into sharing personal information.

    In an era of misinformation, how do we know when a website is real? 

    One way is to research a website’s domain. A domain name is the part of a website address that ends in .com, .net or another popular suffix. It’s essentially just the base website name without the “https://” and “www.”

    “Measuring a website’s credibility might take time,” said Jordan Lyle, a senior reporter for Snopes.com. “Young journalists should know their stuff when it comes to domains and redirects.” 

    Snopes.com is one of the internet’s oldest fact-checking websites. Lyle has more than 25 years of experience in managing websites and knows how to determine whether a site is legit.

    Investigating internet sites

    Alex Kasprak, a former investigative journalist at Snopes.com, has conducted numerous investigations using information gleaned from Domain Name System (DNS) records. DNS records contain information about a particular website, including its URL and IP address — a unique number assigned to every device that connects to the internet.

    With the information he found, Kasprak has been able to uncover unreported connections between news websites and their funders and between scammers and their beneficiaries. 

    “DNS tools are a great first step into any investigation that involves the identity of people behind websites or possible undisclosed connections between them,” Kasprak said.

    Drawing on the expertise of these two investigative reporters, News Decoder has compiled the toolkit below to help you perform a credible and comprehensive examination of a website before publishing.

    Are there red flags?

    Scam websites have certain red flags. They might lack legal documentation, for example, including terms of service and privacy policies. 

    Another sign is sloppiness and mistakes. Try skimming through various pages on the site to look for typos, glaringly incorrect information, vague contact information, skewed formatting and other things that seem unprofessional. 

    Lyle said that a website that promotes a specific giveaway might lack any biographical or contact information about the people promoting the product or offer.

    “Sometimes, scammers will include a mailing address that, upon searching for it, turns out to be a fulfillment center or a business that allows LLCs to anonymously register with that business’ physical office as a virtual address, shielding the scam’s operators from being identified,” Lyle said. 

    Conduct a website domain search.

    Kasprak said that the Internet Corporation for Assigned Names and Numbers (ICANN) operates as a phonebook for the internet.

    “In this analogy, the phone numbers are Internet Protocol (IP) addresses  — a string of numbers formatted like 0.0.0.0 — and the ‘names’ are the actual domain names [e.g. news-decoder.com] to which those IP addresses are associated,” Kasprak said. “Like a human with a phone, domain names can change IP addresses several times.”

    The first step for tracing the origins of a website involves what’s known as a “WHOIS” search — a specific type of domain search listing information about the creation of a domain. 

    WHOIS is a public database that lists the contact numbers, names or organisations associated with a given IP address or domain name. Many people now use services that allow them to register a website anonymously, which limits the value of the results. Older records, or those from some non-Western nations, often include actual names or corporate contacts, Kasprak explained.

    A WHOIS search, which can be conducted at godaddy.com/whois, queries the public WHOIS database. 

    Lyle said he often looks at the date a person officially purchased and registered a domain name. “For example, in the case of researching potential scams, if a domain name was recently registered, that’s a red flag indicating the website might be untrustworthy and could confirm the potential scam as legitimate,” he said.

    Look at the site history.

    Another great tool to pair with “WHOIS” searches is the Internet Archive’s Wayback Machine. When performing a “WHOIS” search on godaddy.com/whois, note when the domain was created. That year should match the Wayback Machine’s earliest snapshots of the site, which can also reveal whether the domain previously hosted completely different websites under other owners.

    “Also, know that the domain information listed in a WHOIS search might be the most recent data, but not the original data,” Lyle said. “Check the Wayback Machine to see if the website existed long ago in another form.”

    Scammers might also create fake domains to pretend to be a legitimate business, adjusting the URL link slightly to trick users. A fake Home Depot ad on Facebook, for example, didn’t lead to homedepot.com when clicked through, but instead to “h0medepott.com”; an “o” was changed to a zero and a second “t” was added to the end of the URL. 

    “Scammers have created fake domains almost matching the genuine business domain for banks, as well as for USPS, for example,” Lyle said. “Sometimes, scammers won’t even bother to create similar domain names and instead simply rely on people not looking at the URL.” 
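    Lookalike domains like “h0medepott.com” can also be caught mechanically: a suspicious domain is one within a small edit distance of a known brand domain without matching it exactly. A rough sketch, where the distance cutoff of 2 and the brand list are arbitrary choices for illustration:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via the standard dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def looks_like_typosquat(domain: str, known_brands: list[str],
                         cutoff: int = 2) -> bool:
    """Flag domains close to, but not exactly matching, a known brand domain."""
    return any(0 < edit_distance(domain, brand) <= cutoff
               for brand in known_brands)
```

    For example, `edit_distance("h0medepott.com", "homedepot.com")` is 2: one substituted character and one added letter, exactly the trick described above.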

    Some scammers go so far as to copy the web design of a company — logo and all — to trick consumers. These types of scam websites often offer giveaways that seem too good to be true, such as free money, deeply discounted goods or services, or non-existent programs for student loan forgiveness.

    “Of course, the biggest red flag would be an offer that seems too good to be true,” Lyle said. “If an offer seems too good to be true, it probably is. And I will go a step further: In 2025, if an offer seems too good to be true, it is. Avoid it.”

    For journalists, all this should become standard practice when using information from the internet in news stories.

    “Basically, you want to make sure you did everything you could with your research before publishing your article,” Lyle said. “And that you attempted to go above and beyond expectations other publishers might have for their articles’ comprehensive credibility.”



    Questions to consider:

    1. What are some common red flags that a website might be fake or trying to scam you?

    2. What is a DNS register and how is it useful to identify a potential scam?

    3. If a friend sent you an unknown link, what steps would you take before clicking? How would you explain your choice to click or not?



    Source link

  • TEF6: the incredible machine takes over quality assurance regulation


    If you loved the Teaching Excellence Framework, were thrilled by the outcomes (B3) thresholds, lost your mind for the Equality of Opportunity Risk Register, and delighted in the sporadic risk-based OfS investigations based on years-old data, you’ll find a lot to love in the latest set of Office for Students proposals on quality assurance.

    In today’s Consultation on the future approach to quality regulation you’ll find a cyclical, cohort-based TEF that also includes a measurement (against benchmarks) of compliance with the thresholds for student outcomes inscribed in the B3 condition. Based on the outcomes of this super-TEF, and prioritised on an assessment of risk, OfS will make interventions (including controls on recruitment and the conditions of degree awarding powers) and targeted investigations. This is a first-stage consultation only; stage two will come in August 2026.

    It’s not quite a grand unified theory: we don’t mix in the rest of the B conditions (covering less pressing matters like academic standards, the academic experience, student support, assessment) because, in the words of OfS:

    Such an approach would be likely to involve visits to all providers, to assess whether they meet all the relevant B conditions of registration

    The students who are struggling right now with the impacts of higher student/staff ratios and a lack of capacity due to over-recruitment will greatly appreciate this reduction in administrative burden.

    Where we left things

    When we last considered TEF we were expecting an exercise every four years, drawing on provider narrative submissions (which included a chunk on a provider’s own definition and measurement of educational gain), students’ union narrative submissions, and data on outcomes and student satisfaction. Providers were awarded a “medal” for each of student outcomes and student experience – a matrix determined whether this resulted in an overall Bronze, Silver, Gold or Requires Improvement.

    The first three of these awards were deemed to be above minimum standards (with slight differences between each), while the latter was a portal to the much more punitive world of regulation under group B (student experience) conditions of registration. Most of the good bits of this approach came from the genuinely superb Pearce Review of TEF conducted under section 26 of the Higher Education and Research Act, which fixed a lot of the statistical and process nonsense that had crept in under previous iterations and then-current plans (though not every recommendation was implemented).

    TEF awards were last made in 2023, with the next iteration – involving all registered providers plus anyone else who wanted to play along – due in 2027.

    Perma-TEF

    A return to a rolling TEF rather than a quadrennial quality enhancement jamboree means a pool of TEF assessors rather than a one-off panel. There will be steps taken to ensure that an appropriate group of academic and student assessors is selected to assess each cohort – there will be special efforts made to use those with experience of smaller, specialist, and college-based providers – and a tenure of two-to-three years is planned. OfS is also considering whether its staff can be included among the storied ranks of those empowered to facilitate ratings decisions.

    Likewise, we’ll need a more established appeals system. Open only to those with Bronze or Requires Improvement ratings (Gold and Silver are passing grades), it would be a way to potentially forestall engagement and investigations based on an active risk to student experience or outcomes, or on a risk of a future breach of a condition of registration.

    Each provider would be assessed once every three years – all providers taking part in the first cycle would be assessed in either 2027-28, 2028-29, or 2029-30 (which covers only undergraduate students because there’s no postgraduate NSS yet – OfS plan to develop one before 2030). In many cases they’ll only know which one at the start of the academic year in question, which will give them six months to get their submissions sorted.

    Because Bronze is now bad (rather than “good but not great” as it used to be), the first year’s cohort could well include all providers with a 2023 Bronze (or Requires Improvement) rating, plus some with increased risks of non-compliance, some with Bronze in one of the TEF aspects, and some without a rating.

    After this, how often you are assessed depends on your rating – if you are Gold overall it is five years till the next try, Silver means four years, and Bronze three (if you are “Requires Improvement” you probably have other concerns beyond the date of your next assessment) but this can be tweaked if OfS decides there is an increased risk to quality or for any other reason.

    Snakes and ladders

    Ignore the gradations and matrices in the Pearce Review – the plan now is that your lowest TEF aspect rating (remember, you got sub-awards last time for student experience and student outcomes) will be your overall rating. So Silver for experience and Bronze for outcomes makes for an overall Bronze. As OfS has decided that you now have to pay (likely around £25,000) to enter what is a compulsory exercise, this is a cost that could lead to a larger cost in future.
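    Since a rule like this is easy to misread, here is a tiny sketch of the proposed “lowest aspect wins” calculation. The rating names come from the consultation; the code itself is purely an illustration:

```python
# Ordinal ranking of the proposed TEF ratings, lowest first.
ORDER = ["Requires Improvement", "Bronze", "Silver", "Gold"]

def overall_rating(experience: str, outcomes: str) -> str:
    """Proposed rule: the overall rating is the lower of the two aspect ratings."""
    return min(experience, outcomes, key=ORDER.index)
```

    So Silver for experience and Bronze for outcomes yields an overall Bronze, as the text says.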

    In previous TEFs, the only negative consequence for those outside of the top ratings has been reputational – a loss of bragging rights of, arguably, negligible value. The new proposals align Bronze with the (B3) minimum required standards and put Requires Improvement below these: in the new calculus of value the minimum is not good enough and there will be consequences.

    We’ve already had some hints that a link to fee cap levels is back on the cards, but in the meantime OfS is pondering a cap on student numbers expansion to punish those who turn out Bronze or Requires Improvement. The workings of the expansion cap will be familiar to those who recall the old additional student numbers process – increases of more than five per cent (the old tolerance band, which is still a lot) would not be permitted for poorly rated providers.

    For providers without degree awarding powers it is unlikely they will be successful in applying for them with Bronze and below – but OfS is also thinking about restricting aspects of existing providers’ DAPs, for example limiting their ability to subcontract or franchise provision in future. This is another de facto numbers cap in many cases, and is all ahead of a future consultation on DAPs that could make for an even closer link with TEF.

    Proposals for progression

    Proposal 6 will simplify the existing B3 thresholds, and integrate the way they are assessed into the TEF process. In a nutshell, the progression requirement for B3 would disappear – the assessment would be made purely on continuation and completion, and providers would be able to submit contextual and historic information, as part of the TEF process, to explain why performance is not above the benchmark or threshold.

    Progression will still be considered at the higher levels of TEF, and here contextual information can play more of a part – with what I propose we start calling the Norland Clause, allowing providers to submit details of courses that lead to jobs that ONS does not consider as professional or managerial. That existing indicator will be joined by another based on (Graduate Outcomes) graduate reflections on how they are using what they have learned, and benchmarked salaries three years after graduation from DfE’s Longitudinal Education Outcomes (LEO) data – in deference to that random Kemi Badenoch IFS commission at the tail end of the last parliament.

    Again, there will be contextual benchmarks for these measures (and hopefully some hefty caveating on the use of LEO median salaries) – and, as is the pattern in this consultation, there are detailed proposals to follow.

    Marginal gains, marginal losses

    The “educational gains” experiment, pioneered in the last TEF, is over: this makes three times that a regulator in England has tried and failed to include a measure of learning gain in some form of regulation. OfS is still happy for you to mention your educational gain work in your next narrative submission, but it isn’t compulsory. The reason: reducing burden, and a focus on comparability rather than a diversity of bespoke measures.

    Asking providers what something means in their context, rather than applying a one-size-fits-all measure of student success, was an immensely powerful component of the last exercise. Providers who started on that journey at considerable expense in data gathering and analysis may be less than pleased at this latest development – and we’d certainly understood that DfE were fans of the approach too.

    Similarly, the requirement for students to feed back on outcomes in their submissions to TEF has been removed. The ostensible reason is that students found it difficult last time round – the result is that insight from the valuable networks between existing students and their recently graduated peers is lost. The outcomes end of TEF is now very much data driven, with only the chance to explain unusual results offered. It’s a retreat from some of the contextual sense that crept in with the Pearce Review.

    Business as usual

    Even though TEF now feels like it is everywhere and for always, there’s still a place for OfS’ regular risk-based monitoring – and annex I (yes, there’s that many annexes) contains a useful draft monitoring tool.

    Here it is very good to see staff:student ratios, falling entry requirements, a large growth in foundation year provision, and a rapid growth in numbers among what are noted as indicators of risk to the student experience. It is possible to imagine an excellent system, designed outside of the seemingly inviolate framework of the TEF, where events like this would trigger an investigation of provider governance and quality assurance processes.

    Alas, the main use of this monitoring is to decide whether or not to bring a TEF assessment forward, which punts an immediate risk to students into something that will be dealt with retrospectively. If I’m a student in a first-year cohort that has ballooned from 300 to 900 from one cycle to the next, there is a lot of good a regulator can do by acting quickly – I am unlikely to care whether a Bronze or Silver award is made in a couple of years’ time.

    International principles

    One of the key recommendations of the Behan review on quality was a drawing together of the various disparate (and, yes, burdensome) streams of quality and standards assurance and enhancement into a unified whole. We obviously don’t quite get there – but there has been progress made towards another key sector bugbear that came up both in Behan and the Lords’ Industry and Regulators Committee review: adherence to international quality assurance standards (to facilitate international partnerships and, increasingly, recruitment).

    OfS will “work towards applying to join the European Quality Assurance Register for Higher Education” at the appropriate time – clearly feeling that the long overdue centring of the student voice in quality assurance (there will be an expanded role for and range of student assessors) and the incorporation of a cyclical element (to desk assessments at least) is enough to get them over the bar.

    It isn’t. Principle 2.1 of the EQAR ESG requires that “external quality assurance should address the effectiveness of the internal quality assurance processes” – philosophically establishing the key role of providers themselves in monitoring and upholding the quality of their own provision, with the external assurance process primarily assessing whether (and how well) this has been done. For whatever reason OfS believes the state (in the form of the regulator) needs to be (and is capable of being!) responsible for all quality assurance, everywhere, all the time. It’s a glaring weakness of the OfS system that urgently needs to be addressed. And it hasn’t been, this time.

    The upshot is that while the new system looks ESG-ish, it is unlikely to be judged to be in full compliance.

    Single word judgements

    The recent use of single headline judgements of educational quality in ways that have far-reaching regulatory implications is hugely problematic. The government announced the abandonment of the old “requires improvement, inadequate, good, and outstanding” judgements for schools in favour of a more nuanced “report card approach” – driven in part by the death by suicide of headteacher Ruth Perry in 2023. The “inadequate” rating given to her Caversham Primary School would have meant forced academisation and deeper regulatory oversight.

    Regulation and quality assurance in education needs to be rigorous and reliable – it also needs to be context-aware and focused on improvement rather than retribution. Giving single headline grades cute, Olympics-inspired names doesn’t really cut it – and as we approach the fifth redesign of an exercise that has only run six times since 2016 you would perhaps think that rather harder questions need to be asked about the value (and cost!) of this undertaking.

    If we want to assess and control the risks of modular provision, transnational education, rapid expansion, and a growing number of innovations in delivery we need providers as active partners in the process. If we want to let universities try new things we need to start from a position that we can trust universities to have a focus on the quality of the student experience that is robust and transparent. We are reaching the limits of the current approach. Bad actors will continue to get away with poor quality provision – students won’t see timely regulatory action to prevent this – and eventually someone is going to get hurt.

    Source link

  • These teens can do incredible math in their heads but fail in a classroom


    When I was 12, my family lived adjacent to a small farm. Though I was not old enough to work, the farm’s owner, Mr. Hall, hired me to man his roadside stand on weekends. Mr. Hall had one rule: no calculators. Technology wasn’t his vibe. 

    Math was my strong suit in school, but I struggled to tally the sums in my head. I weighed odd amounts of tomatoes, zucchini and peppers on a scale and frantically scribbled calculations on a notepad. When it got busy, customers lined up waiting for me to multiply and add. I’m sure I mischarged them.

    I was thinking about my old job as I read a quirky math study published this month in the journal Nature. Nobel Prize-winning economists Abhijit Banerjee and Esther Duflo, a husband-and-wife research team at MIT, documented how teenage street sellers who were excellent at mental arithmetic weren’t good at rudimentary classroom math. Meanwhile, strong math students their same age couldn’t calculate nearly as well as impoverished street sellers.

    “When you spend a lot of time in India, what is striking is that these market kids seem to be able to count very well,” said Duflo, whose primary work in India involves alleviating poverty and raising the educational achievement of poor children.  “But they are really not able to go from street math to formal math and vice versa.”

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    In a series of experiments, Duflo’s field staff in India pretended to be ordinary shoppers and purposely bought unusual quantities of items from more than 1,400 child street sellers in Delhi and Kolkata. A purchase might be 800 grams of potatoes at 20 rupees per kilogram and 1.4 kilograms of onions at 15 rupees per kilogram. Most of the child sellers quoted the correct price of 37 rupees and gave the correct change from a 200 rupee note without using a calculator or pencil and paper. The odd quantities were to make sure the children hadn’t simply memorized the price of common purchases. They were actually making calculations. 
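    The arithmetic in the example purchase can be checked in a few lines (working in grams and whole rupees to keep the numbers exact):

```python
# The study's example purchase, computed in gram-rupees to stay in integers.
potatoes = 800 * 20                   # 800 g at 20 rupees per kg
onions = 1400 * 15                    # 1.4 kg at 15 rupees per kg
total = (potatoes + onions) // 1000   # rupees owed: 37
change = 200 - total                  # change from a 200 rupee note: 163
```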

    However, these same children, the majority of whom were 14 or 15 years old, struggled to solve much simpler school math problems, such as basic division. (After making the purchases, the undercover shoppers revealed their identities and asked the sellers to participate in the study and complete a set of abstract math exercises.)

    The market sellers had some formal education. Most were attending school part time, or had previously been in school for years.

    Duflo doesn’t know how the young street sellers learned to calculate so quickly in their heads. That would take a longer anthropological study to observe them over time. But Duflo was able to glean some of their strategies, such as rounding. For example, instead of multiplying 490 by 20, the street sellers might multiply 500 by 20 and then remove 10 of the 20s, or 200. Schoolchildren, by contrast, are prone to making lengthy pencil and paper calculations using an algorithm for multiplication. They often don’t see a more efficient way to solve a problem.
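    The rounding strategy Duflo describes is a distributive-law shortcut: 490 × 20 becomes (500 × 20) − (10 × 20). A small sketch of that strategy as a general routine, where the function name and the choice of rounding base are illustrative:

```python
def round_and_correct(n: int, m: int, base: int = 100) -> int:
    """Multiply n * m the street-seller way: round n up to the nearest
    multiple of 'base', multiply, then subtract the overshoot."""
    rounded = -(-n // base) * base      # ceiling of n to a multiple of base
    return rounded * m - (rounded - n) * m
```

    For 490 × 20 this computes 500 × 20 = 10,000 and removes ten 20s, giving 9,800, exactly the shortcut described above.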

    Lessons from this research on the other side of the world might be relevant here in the United States. Some cognitive psychologists theorize that learning math in a real-world context can help children absorb abstract math and apply it in different situations. However, this Indian study shows that this type of knowledge transfer probably won’t happen automatically or easily for most students. Educators need to figure out how to better leverage the math skills that students already have, Duflo said. Easier said than done, I suspect.  

    Related: Do math drills help children learn?

    Duflo says her study is not an argument for either applied or abstract math.  “It would be a mistake to conclude that we should switch to doing only concrete problems because we also see that kids who are extremely good at concrete problems are unable to solve an abstract problem,” she said. “And in life, at least in school life, you’re going to need both.” Many of the market children ultimately drop out of school altogether.

    Back at my neighborhood farmstand, I remember how I magically got the hang of it and rarely needed pencil and paper after a few months. Sadly, the Hall farm is no longer there for the town’s children to practice mental math. It’s now been replaced by a suburban subdivision of fancy houses. 

    Contact staff writer Jill Barshay at 212-678-3595 or [email protected].

    This story about applied math was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

    Source link

  • 10 Keys to Building an Incredible Brand for Academics LIVE EVENT


    I’m teaming up with Dr. Sheena Howard to bring you a live VIP event for academics. You’re invited!

    EVENT: 10 Keys to Building an Incredible Brand as an Academic

    • Increase your confidence
    • Make 5x your investment
    • Waste far less time because you have an actionable plan
    • 5+ free tools to help you implement what you learn
    • 3+ downloadable PDFs
    • Grow your following by at least 25%

    Date: December 10 or December 11

    Time: 2-4pm Eastern Time

    Where: Virtual (on Zoom)

    Can’t make it live? A replay will be sent to you.

    This event is complete. Thank you for coming!

    A dream team collaboration event

    10 Keys to Build an Incredible Brand for Academics, a graphic with a large key and photos of Dr. Sheena Howard and Jennifer van Alstyne smiling. When: Saturday, December 10 and Sunday, December 11 from 2-4pm Eastern Time.

    I’m Jennifer van Alstyne. When I started building my brand, I wanted to create the academic life that I wanted. Not the life my advisors or mentors wanted for me. My online presence helped me take that step. Now I help professors build an online presence their research deserves. So that they feel more confident, help more people, and build their scholarly community online.

    I’m so excited to team up with the incredible Dr. Sheena Howard. She’s an expert at helping professors get the media attention they deserve. She’s all about building your visibility, authority, and income with Power Your Research.


    We can’t wait to see you at 10 Keys to Building an Incredible Brand for Academics, our live virtual event.

    This event is for all people with an advanced degree (like a master’s or doctorate). This event is for you whether you’re in or out of the academy.

    Topics covered will include, but are not limited to:

    • Best free resources to get high-level media coverage right away.
    • Getting clarity on what building your brand looks like for you.
    • Building an incredible website that stands out.
    • Social media plan and strategy for the busy academic.
    • Building a 6-figure brand by leveraging your academic credentials, whether you are in or out of academe.

    This event only happens once a year. You don’t want to miss it.

    Get tickets for 10 Keys to Build an Incredible Brand for Academics today. Limited seats are available.

    This event is complete. It was on December 10 and December 11, 2022. Thank you for attending, we were so happy to help inspire you. This event only happens once a year. If you’re interested in attending next year, email me at [email protected]


    Source link