Everything posted by Cpvr
-
Closed Vs. Open Community
[HEADING=2]Open vs. Closed Communities: Which Is Right for You?[/HEADING]
When it comes to building a community, the first key decision is whether it should be open (public) or closed (private). This guide will help you make an informed choice based on your goals and audience. As someone passionate about community building, I firmly believe every online business should foster a community around its brand. Over the years, I’ve explored the pros and cons of both open and closed communities; here’s what I’ve learned.

[HEADING=2]Understanding Open Communities[/HEADING]
An open community makes its content publicly accessible. Users can browse content without logging in, but typically need to create an account to engage, such as liking, commenting, or posting.

Examples of open communities: Reddit, Quora, public Facebook Groups, and traditional forums are classic examples. You can explore their content without signing up, making them easy to discover and join.

Advantages of open communities:
1. Exponential Growth: SEO-friendly, driving organic traffic.
2. Broader Reach: Content visibility enhances brand awareness.
3. SEO Benefits: User-generated content boosts search rankings.
4. Network Effect: Larger audiences lead to richer discussions.
5. Lower Marketing Costs: Public visibility reduces advertising spend.
6. Free Promotion: Discussions often get shared on social media.
7. Easy Onboarding: Low entry barriers encourage participation.
8. Growth Driver: Open communities double as marketing engines.

Disadvantages of open communities:
1. Spam Risk: Both bots and human spammers can be an issue.
2. Moderation Challenges: Larger communities require more oversight.
3. Privacy Concerns: Public content may deter sensitive discussions.
4. Content Quality Issues: Ensuring relevance and accuracy can be tough.
5. Security Risks: Public data is vulnerable to scraping and misuse.
6. Limited Trust: Anonymity can hinder meaningful connections.
7. Misinformation: Open platforms are prone to spreading false information.
With proper tools like moderation systems and spam filters, these drawbacks can be managed effectively.

[HEADING=2]Understanding Closed Communities[/HEADING]
Closed communities, on the other hand, restrict access to their content. Users must log in, receive an invitation, or pay for membership to participate.

Examples of closed communities: Slack, Discord, WhatsApp groups, and private Facebook or LinkedIn groups.

Advantages of closed communities:
1. Controlled Membership: Owners can vet and select participants.
2. Enhanced Privacy: Content stays within a trusted group.
3. Focused Content: Discussions tend to stay relevant and high-quality.
4. Reduced Noise: Smaller, more engaged audiences create less clutter.
5. Monetization: Easier to implement paid memberships or premium content.
6. Professional Environment: Members often behave more professionally.

Disadvantages of closed communities:
1. Limited Growth: Restricted access means slower member acquisition.
2. No SEO Benefits: Hidden content doesn’t attract organic traffic.
3. High Maintenance: Keeping members engaged is labor-intensive.
4. Lower Discoverability: Harder for new users to find and join.
5. Monotony Risk: Smaller communities may lack diverse perspectives.
6. Operational Costs: Requires dedicated management resources.
7. Not Marketing-Friendly: Closed setups don’t support brand visibility.

[HEADING=2]Open vs. Closed: Which Should You Choose?[/HEADING]
For most online businesses, an open community is the better choice. Open communities drive organic growth, enhance brand visibility, and reduce marketing costs. They’re ideal for businesses that want to showcase user-generated content (UGC) and leverage it for SEO benefits. However, there are cases where a closed community makes sense. If privacy, exclusivity, or monetization through memberships is a priority, such as for course creators or premium groups, a closed setup might be more suitable. In the end, the decision depends on your goals.
If growth, visibility, and accessibility are top priorities, go open. If privacy, control, or monetization matter more, go closed. Choose wisely, and let your community become a powerful asset for your brand.
-
Golden Content Package
I’ve completed my part.
-
How much longer can Facebook stay relevant?
Facebook doesn’t know how to innovate anymore lmao. They’re known for copying ideas and stealing other companies’ concepts. They’re doing the same thing now with their Threads app.🤣
-
Community Chat Thread
Good evening y’all! Happy Thanksgiving!
-
What are you listening to?
I’m currently listening to On go by Lil loonie. [MEDIA=spotify]track:5pQrNaKcmKBd1C1MdVWnzB[/MEDIA]
-
Hi? I'm new
Welcome to the community! How are you doing today? Happy Thanksgiving
-
Golden Content Package
Taking the first slot
-
Hello everyone!
Hey [mention=13]Phillip[/mention] welcome to Agora!
-
Wordpress VS WP Engine
[HEADING=2]WP Engine Vs Automattic: Judge Inclined To Grant Preliminary Injunction[/HEADING] WP Engine's attorney on Mullenweg's $32M demand: "That's not how you calculate a royalty. That's how you set a ransom." Judge indicates she's leaning toward granting "some sort of injunction." Attorneys have until Tuesday, December 3, to present "dueling submissions," which will determine how the judge will rule. https://www.searchenginejournal.com/wp-engine-vs-automattic-judge-inclined-to-grant-preliminary-injunction/533746/
-
What are you listening to?
I’m currently listening to Switched up by Morray. [MEDIA=spotify]track:0zAxe63ZJGboujUs92LFrm[/MEDIA]
-
What are you listening to?
I’m currently listening to Blame yourself by 92Legend. [MEDIA=spotify]track:5vFLPwh696EJyDNEDXnrTz[/MEDIA]
-
What was the last tv show that you watched?
I need to rewatch it again. It’s been a while since I watched it. It’s a pretty good show! I’d say it’s one of the best series Apple TV has released this year. There will be a second season too, so I’m looking forward to watching it when it’s released!
-
What are you listening to?
I’m currently listening to Legacy by Lil Tjay. [MEDIA=spotify]track:0kZLYZ9qpZrQYedUMI7s58[/MEDIA]
-
What are you listening to?
I’m currently listening to Beat the odds by Lil Tjay. [MEDIA=spotify]track:2BJWxD8xKrDv8vneTvTIm9[/MEDIA]
-
Is it a good idea to request posting packages?
Yes, it’s a good idea to request a posting package while your board is offline and being built. That way, when you’re ready to open, your forum will be seeded with content rather than being empty. This allows you to hit the ground running, with more content available for new members, as opposed to having limited content when you first launch. A posting package is designed to help you by providing the resources you need to kickstart your forum.
-
Google Rolls Out Search Console Recommendations
In August, Google introduced Google Search Console recommendations, slowly rolling them out to more and more users over time. Now, Google has fully rolled the feature out: if Google has Search Console recommendations for your sites, it will show them on the home page of the dashboard.

Google wrote on LinkedIn and X, "We're happy to let you know that Recommendations are now available to everyone! Note that you'll see them only if we have a recommendation available for your website."

Again, this does not mean you will see Google Search Console recommendations; Google simply might not have any for your property. I checked several Google Search Console properties and only saw recommendations for a couple of them.

When the feature was first announced, Google explained that even "after complete rollout, we'll only provide recommendations when we have a recommendation available for your website."

Google said these are the types of recommendations you might see:
Issues (something that could be fixed)
Opportunities (something that could improve your traffic)
Configuration (something that could make your work easier)

The recommendations "can help you prioritize your search optimization efforts, such as using structured data to help Google understand your content, adding sitemaps, and checking out trending queries and pages," Google wrote.

Source: https://www.seroundtable.com/google-search-console-recommendations-live-38479.html
-
Bigger and badder: how DDoS attack sizes have evolved over the last decade
Distributed denial-of-service (DDoS) attacks are cyberattacks that aim to overwhelm and disrupt online services, making them inaccessible to users. By leveraging a network of distributed devices, DDoS attacks flood the target system with excessive requests, consuming its bandwidth or exhausting compute resources to the point of failure. These attacks can be highly effective against unprotected sites and relatively inexpensive for attackers to launch.

Despite being one of the oldest types of attacks, DDoS attacks remain a constant threat, often targeting well-known or high-traffic websites, services, or critical infrastructure. Cloudflare has mitigated over 14.5 million DDoS attacks since the start of 2024, an average of 2,200 DDoS attacks per hour. (Our DDoS Threat Report for Q3 2024 contains additional related statistics.)

If we look at the metrics associated with large attacks mitigated in the last 10 years, do we see a steady exponential curve that keeps getting steeper, especially over the last few years, or something closer to linear growth? We found that the growth is not linear but exponential, with the slope dependent on the metric we are looking at.

Why is this question interesting? Simple. The answer provides valuable insights into the evolving strategies of attackers, the sophistication of their tools, and the readiness of defense mechanisms. As an example, an upward curve in the number of requests per second (rps) suggests that attackers are changing something on their side that enables them to generate larger volumes of requests. This is an insight that prompts us to investigate further and look at other data to understand whether anything new is happening.
For instance, at one of those moments, we looked at the source of the traffic and saw a shift from subscriber/enterprise IP address space (suggesting IoT) to cloud provider IP address space (suggesting VMs), and realized there was a shift in the type and capabilities of devices used by attackers. As another example: when the HTTP/2 Rapid Reset attack happened, the record number of requests per second seen at that time suggested that a new technique was being employed by attackers, prompting us to swiftly investigate what was being executed and adapt our defenses.

[HEADING=1]Defining individual attacks[/HEADING]
Delimiting an individual attack in time is surprisingly blurry. First of all, an attack analysis can provide inconsistent observations at different layers of the OSI model. The footprint seen at these different layers may tell different stories for the same attack. There are, however, some variables that together allow us to create a fingerprint and group a set of events, establishing that they are part of the same individual attack. Examples include:

Do we see the same attack vector(s) being used across this set of events?
Are all the attack events focused on the same target(s)?
Do the payloads on events share the same signature? (Specific data payloads or request types unique to certain types of attacks or botnets, like Mirai, which may use distinctive HTTP request headers or packet structures.)

[HEADING=1]DDoS attack sizes[/HEADING]
Before we dive into a growth analysis of DDoS attacks over the last 10 years, let's take a step back and look at the metrics typically used to measure them: requests per second (rps), packets per second (pps), and bits per second (bps). Each metric captures a different aspect of the attack's scale and impact.

Requests per second (rps): Measures the number of HTTP or similar protocol requests made each second.
This metric is particularly relevant for application-layer attacks (Layer 7), where the intent is to overwhelm a specific application or service by overloading its request handling. It is useful for measuring attacks targeting web servers, APIs, or applications because it reflects the volume of requests, not just raw data transfer.

Packets per second (pps): Represents the number of individual packets sent to the target per second, regardless of their size. This metric is critical for network-layer attacks (Layers 3 and 4), where the goal is to overwhelm network infrastructure by exceeding its packet-processing capacity. pps measurements are useful for volumetric attacks, identifying a quantity of packets that can impact routers, switches, or firewalls.

Bits per second (bps): Measures the total data transferred per second and is especially useful for evaluating network-layer attacks that aim to saturate the bandwidth of the target or its upstream provider. bps is widely used for measuring Layer 3 and 4 attacks, such as UDP floods, where the attack intends to clog network bandwidth. This metric is often highlighted for DDoS attacks because high bps values (often measured in gigabits or terabits) signal bandwidth saturation, a common goal of large-scale DDoS campaigns.

[HEADING=1]Evolution of DDoS attack sizes over the last decade[/HEADING]
So, how have DDoS attack sizes changed in the last decade? During this period, DDoS attacks have grown bigger and stronger, with each year having the potential to be more disruptive. If we look at the metrics associated with large attacks seen in the last 10 years, do we see a steady exponential curve that keeps steepening, especially in the last few years, or something closer to linear growth? We found that it is exponential, so let's look at the details behind that conclusion.
In this analysis, we used attacks that Google observed from 2010 until 2022 as a baseline (Figure 1), extended with attacks that Cloudflare observed in 2023 and 2024 (Figure 2). Going back in time, early in the 2010s the largest attacks were measured on the gigabits per second (Gbps) scale, but these days it's all about terabits per second (Tbps). The numbers of requests per second (rps) and bits per second (bps) are also significantly higher these days, as we will see.

The historical data from Google shown below in Figure 1 reveals a rising trend in requests per second during DDoS attacks observed between 2010 and 2022, peaking at 6 million requests per second (Mrps) in 2020. The increase highlights a significant escalation in attack volume across the decade.

Figure 1. Largest known DDoS attacks, 2010 - 2022. (Source: Google)

Figure 2 (below) provides a view of trends seen across the different metrics. The escalation seen in Google's statistics is also visible in Cloudflare's data on large mitigated DDoS attacks observed in 2023 and 2024, reaching 201 Mrps (green line) in September 2024. The rate of packets per second (blue line) demonstrates slight exponential growth over time, rising from 230 Mpps in 2015 to 2,100 Mpps in 2024, suggesting that attackers are achieving higher throughput. For bits per second (red line), the trend is also exponential, with a steeper upward curve, building from a 309 Gbps attack in 2013 to a 5.6 Tbps (5,600 Gbps) attack in 2024.

Over roughly the last decade, attacks driving these metrics have seen significant growth rates:
Bits per second increased by 20x between 2013 and 2024
Packets per second increased by 10x between 2015 and 2024
Requests per second increased by 70x between 2014 and 2024

Figure 2. Data from Figure 1 extended with large attacks observed by Cloudflare in 2023 and 2024.

The blog posts listed in Table 1 highlight some of the attacks that we observed from 2021 to 2024.
Month | Attack size | Blog post
August 2021 | 17.2 Mrps | Cloudflare thwarts 17.2M rps DDoS attack — the largest ever reported
April 2022 | 15 Mrps | Cloudflare blocks 15M rps HTTPS DDoS attack
June 2022 | 26 Mrps | Cloudflare mitigates 26 million request per second DDoS attack
February 2023 | 71 Mrps | Cloudflare mitigates record-breaking 71 million request-per-second DDoS attack
September 2024 | 3.8 Tbps | How Cloudflare auto-mitigated world record 3.8 Tbps DDoS attack
October 2024 | 4.2 Tbps | 4.2 Tbps of bad packets and a whole lot more: Cloudflare's Q3 DDoS report
October 2024 | 5.6 Tbps | 5.6 Tbps attack

Table 1. Notable DDoS attacks observed by Cloudflare between 2021 and 2024.

An overview of other selected significant high-volume DDoS attacks that have occurred over the last decade, including 2018's Memcached abuse and 2023's HTTP/2 "Rapid Reset" attacks, can be found in the Cloudflare Learning Center.

[HEADING=1]Attack duration as a metric[/HEADING]
Attack duration is not an effective metric for qualifying attack aggressiveness, because establishing the duration of a single attack or campaign is challenging due to their possibly intermittent nature, the potential for a multitude of attack vectors being used at the same time, and how the different defense layers trigger over time. Attack patterns can differ considerably: some consist of a single large spike, while others feature multiple tightly grouped spikes or a continuous load maintained over a period of time, along with other changing characteristics.

[HEADING=1]Trend in types of devices used to create attacks[/HEADING]
DDoS attacks are increasingly shifting from IoT-based botnets to more powerful VM-based botnets. This change is primarily due to the higher computational and throughput capabilities of cloud-hosted virtual machines, which allow attackers to launch massive attacks with far fewer devices.
This shift is facilitated by several factors: VM botnets can be easier to establish than IoT botnets, as they don't necessarily require widespread malware infections; attackers can deploy them on cloud provider infrastructure anonymously, using stolen payment details from data breaches or Magecart attacks. This trend points to the evolution of DDoS tactics, as attackers exploit both the processing power of VMs and anonymized access to cloud resources, enabling smaller, more efficient botnets capable of launching large-scale attacks without the complexities involved in infecting and managing fleets of IoT devices.

[HEADING=1]How does Cloudflare help protect against DDoS attacks?[/HEADING]
Cloudflare's Connectivity Cloud, built on our expansive anycast global network, plays a crucial role in defending against DDoS attacks by leveraging automated detection, traffic distribution, and rapid response capabilities. Here's how it strengthens DDoS protection:

Automated attack detection and mitigation: Cloudflare's DDoS protection relies heavily on automation, using machine learning algorithms to identify suspicious traffic patterns in real time. By automating the detection process, Cloudflare can quickly recognize and block DDoS attacks without requiring manual intervention, which is critical in high-volume attacks that would overwhelm human responders.

Global traffic distribution with IP anycast: Cloudflare's network spans over 330 cities worldwide, and DDoS traffic gets distributed across our many data centers. IP anycast allows us to distribute traffic across this global network, and this wide distribution helps absorb and mitigate large-scale attacks: attack traffic is not directed at a single point, reducing strain on individual servers and networks.

Layered defense: Cloudflare's Connectivity Cloud offers defense across multiple layers, including network (Layer 3), transport (Layer 4), and application (Layer 7).
This layered approach allows for tailored defense strategies depending on the attack type, ensuring that even complex, multi-layered attacks can be mitigated effectively. Learn more about DDoS protection at Layers 3, 4, and 7 in our DDoS protection documentation.

Unmetered DDoS mitigation: Pioneering this approach since 2017 to help ensure Internet security, Cloudflare provides unmetered DDoS protection, meaning customers are protected without worrying about bandwidth or cost limitations during attacks. This helps ensure that businesses, regardless of size or budget, can benefit from robust DDoS protection.

Cloudflare's distributed cloud infrastructure and advanced technology allow us to detect, absorb, and mitigate DDoS attacks in a way that is both scalable and responsive, avoiding downtime, maintaining service reliability, and providing a robust solution to the rising intensity and frequency of DDoS attacks compared to traditional options.

Protecting against DDoS attacks is essential for organizations of every size. Although humans initiate these attacks, they're carried out by bots, so effective defense requires automated tools to counter bot-driven threats. Real-time detection and mitigation should be as automated as possible: relying solely on human intervention puts defenders at a disadvantage, as attackers adapt to new barriers and can change attack vectors, traffic behavior, payload signatures, and more, creating unpredicted scenarios and rendering some manual configurations useless. Cloudflare's automated systems continuously identify and block DDoS attacks on behalf of our customers, enabling tailored protection that meets individual needs.

Source: https://blog.cloudflare.com/bigger-and-badder-how-ddos-attack-sizes-have-evolved-over-the-last-decade
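As a quick sanity check on the growth multipliers quoted in the article above (20x in bps, 10x in pps), here is a small back-of-the-envelope sketch. It uses only the data points stated in the post (309 Gbps in 2013 to 5.6 Tbps in 2024, and 230 Mpps in 2015 to 2,100 Mpps in 2024) and the standard compound-annual-growth-rate formula; the article's round numbers come out slightly generous, but the order of magnitude holds.

```python
# Back-of-the-envelope check of the attack-size growth figures quoted above.
# Data points are taken from the article; only the CAGR arithmetic is added here.

def growth(multiple: float, years: int) -> float:
    """Compound annual growth rate implied by an overall multiple over N years."""
    return multiple ** (1 / years) - 1

# (metric, start value, start year, end value, end year), per the article
attacks = [
    ("bps (Gbps)", 309, 2013, 5600, 2024),   # 309 Gbps -> 5.6 Tbps
    ("pps (Mpps)", 230, 2015, 2100, 2024),   # 230 Mpps -> 2,100 Mpps
]

for name, v0, y0, v1, y1 in attacks:
    mult = v1 / v0
    cagr = growth(mult, y1 - y0)
    print(f"{name}: {mult:.0f}x over {y1 - y0} years, about {cagr:.0%} per year")
# -> bps (Gbps): 18x over 11 years, about 30% per year
# -> pps (Mpps): 9x over 9 years, about 28% per year
```

In other words, "exponential" here means large attacks have compounded at roughly 30% per year in raw bandwidth for over a decade, which is why the units shifted from Gbps to Tbps.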
-
25 Threads for jCodes
I’ve completed my part.
-
What’s the best way to deal with negative behavior from members?
The best way to handle negative members is to approach them with kindness and understanding initially. Address their concerns calmly and give them a chance to adjust their behavior. However, if a member continues to spread negativity despite 2-3 warnings, it’s important to take decisive action to protect the community. At that point, removing them from the community might be the best option to maintain a positive and welcoming environment for everyone else.
-
Community Chat Thread
Good evening! How’s Everything going for you?
-
How did you grow your community when you first started it?
It’s always a great idea to seed your forum with content before advertising or doing post exchanges—that’s one of my mottos too! The best part of owning a community is watching it grow before your eyes, going from 5 members to 20, and then to 100. It’s such an amazing feeling—there’s nothing greater! Every effort we put into growing our forums eventually pays off. It takes time and dedication to promote, but it’s always worth it in the end!
-
Your Niche
It’s a good niche, as there are a lot of people using AI. The market is still booming, and the industry isn’t going anywhere anytime soon. I think it was a great choice. There hasn’t been a forum in that particular niche, aside from Reddit, which is a great sign! Stay focused and keep striving!
-
XenForo 2.4 coming soon?
XenForo has released its first “have you seen” post relating to 2.4. https://xenforo.com/community/threads/add-on-update-notifications.227134/

Once we roll out XenForo 2.4 here, resource authors will find that we have added a new field to resources in add-on categories that will allow you to fill in your add-on ID. When you install XenForo 2.4 on your own forums, your forum will periodically call back to XF.com with a list of your installed add-ons and their versions, and report back with the latest version available in the Resource Manager. Where there is an update available, as demonstrated above, this will be indicated in your add-ons list. Update checking can be enabled under "Basic board options".

Subject to time constraints, we are looking to add the following additional functionality either before or shortly after the release of XenForo 2.4:
Improved notifications
Custom update checking URLs for developers to populate in their addon.json file, so if the canonical source of your add-on is on your own website, we can request version details from there rather than our Resource Manager
One-click install of upgrades (for free add-ons)
-
What are you listening to?
I’m currently listening to Honesty by SBE Kp. [MEDIA=spotify]track:7KW9hHg7IVf9X3TsE2lATI[/MEDIA]
-
Elizabeth Warren calls for crackdown on Verisign, an Internet “monopoly” you’ve never heard of
US Senator Elizabeth Warren of Massachusetts and Congressman Jerry Nadler of New York have called on government bodies to investigate what they allege is the “predatory pricing” of .com web addresses, the Internet’s prime real estate. In a letter delivered today to the Department of Justice and the National Telecommunications and Information Administration (NTIA), a branch of the Department of Commerce that advises the president, the two Democrats accuse VeriSign, the company that administers the .com top-level domain, of abusing its market dominance to overcharge customers.

In 2018, under the Donald Trump administration, the NTIA modified the terms on how much VeriSign could charge for .com domains. The company has since hiked prices by 30 percent, the letter claims, though its service remains identical and could allegedly be provided far more cheaply by others. “VeriSign is exploiting its monopoly power to charge millions of users excessive prices for registering a .com top-level domain,” the letter claims. “VeriSign hasn’t changed or improved its services; it has simply raised prices because it holds a government-ensured monopoly.”

“We intend to respond to Senator Warren and Representative Nadler’s letter, which repeats inaccuracies and misleading statements that have been aggressively promoted by a small, self-interested group of domain-name investors for years,” said VeriSign spokesperson David McGuire in a statement to WIRED.
“We look forward to correcting the record and working with policymakers toward real solutions that benefit internet users.”

In an August blog post entitled “Setting the Record Straight,” the company claimed that discourse around its management of .com had been “distorted by factual inaccuracies, a misunderstanding of core technical concepts, and misinterpretations regarding pricing, competition, and market dynamics in the domain name industry.” In the same blog post, the company argues that it is not operating a monopoly because there are 1,200 generic top-level domains operated by other entities, including .org, .shop, .ai, and .uk.

Though far from a household name, VeriSign takes in about $1.5 billion in revenue each year for servicing its particular section of the Internet’s inscrutable plumbing. In their letter, Warren and Nadler allege that VeriSign has exploited its exclusive right to charge for highly sought-after .com addresses to juice its revenues and drive up its share price, all at the expense of customers for whom there is no viable alternative.

The letter claims that separate agreements with the NTIA and the Internet Corporation for Assigned Names and Numbers (ICANN), a nonprofit established by the Commerce Department to oversee the web’s domain name system, have allowed VeriSign to establish monopoly power. The former sets how much the company can charge its customers for registering .com addresses, while the latter assigns VeriSign as the “sole operator” of the .com domain. The letter also alleges that VeriSign might be in violation of the Sherman Act.

The NTIA’s decision in 2018 to lift the price cap imposed on VeriSign also benefited ICANN, which in its role as overseer can reject price increases proposed by domain registry services. ICANN signed an agreement with VeriSign in 2020, sanctioning the maximum allowable price increases in return for $20 million over a five-year period.
Thus, allege Warren and Nadler, “VeriSign and ICANN may have a collusive relationship.” In June, a coalition of activist groups wrote to the DOJ and NTIA to express similar allegations. “ICANN and VeriSign function as a de facto cartel, and the NTIA should stop sanctioning the ‘incestuous legal triangle’ that serves as a shield to deflect overdue antitrust scrutiny into their otherwise likely illegal collusive relationship,” the coalition claims. The group urged the government to “stop this cycle of exploitation” by refusing to renew the relationship between the NTIA and VeriSign. Neither ICANN nor the NTIA responded immediately to requests for comment.

The NTIA has since indicated that it will renew its agreement with VeriSign. However, the terms of that agreement are up for review on November 30, before the start of Trump’s second term, leaving the outgoing Democratic administration with an opportunity to put in place pricing rules that will apply for a six-year period, as a parting gift. In an August letter, the NTIA told VeriSign that it “had questions related to [the company's] pricing” and wanted to “discuss possible solutions.” VeriSign said it welcomed “an opportunity to have this important discussion.”

But Warren and Nadler are now publicly pressing the NTIA to make sure that customers cannot be overcharged by VeriSign, and pressing the DOJ to review for potential antitrust violations, too. “VeriSign has squeezed customers to enrich its investors while doing nothing to improve service,” they claim. “NTIA and DOJ should take action to ensure that over the next six years, VeriSign’s consumers are charged fair prices for .com registration.”

Source: https://www.wired.com/story/elizabeth-warren-calls-for-crackdown-verisign/