Author Topic: ISPs torch UK.gov's smut-blocking master plan  (Read 2583 times)


ISPs torch UK.gov's smut-blocking master plan
« on: April 22, 2012, 09:42:48 pm »
Quote
"Forcing ISPs to filter adult content at the network level, which users would then have to opt out of, is neither the most effective nor most appropriate way to prevent access to inappropriate material online," retorted ISPA secretary general Nicholas Lansman.

http://www.theregister.co.uk/2012/04/19/ispa_criticises_smut_blocking_plan/

Offline Plum

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #1 on: April 23, 2012, 02:58:57 am »
While I am mostly against smut, I am also for freedom, and forcing blocking at a network level just puts more work on people and takes up more resources. I would love to see smut filters within the new client, but that would be a user preference, not anything controlled by the government or web interests.

Offline White Stripes

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #2 on: April 23, 2012, 03:10:09 am »
Quote
I would love to see smut filters within the new client,

not possible... just ask limewire about content filters...

Quote
but that would be a user preference

use old-school search operators such as 'thing im looking for -smut' ... but that would only take out things with 'smut' in their file and path names... would need to include too many words for it to be effective... would be like searching for 'la blue girl -tentacle'... results would still get through (pun intended)....
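
To illustrate, here is a rough sketch in Python of how such '-term' exclusion operators could be matched against returned file and path names. The function names are made up for the example; this is not actual WinMX code.

Code:
def parse_query(query):
    """Split a query into include terms and '-' exclude terms."""
    include, exclude = [], []
    for token in query.lower().split():
        if token.startswith('-') and len(token) > 1:
            exclude.append(token[1:])
        else:
            include.append(token)
    return include, exclude

def matches(path_name, include, exclude):
    """Keep a result only if every include term is present and no exclude term is."""
    name = path_name.lower()
    return all(t in name for t in include) and not any(t in name for t in exclude)

include, exclude = parse_query("la blue girl -tentacle")
print(matches("anime/la blue girl ep1.avi", include, exclude))        # True
print(matches("anime/la blue girl tentacle 3.avi", include, exclude)) # False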

Offline Plum

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #3 on: April 23, 2012, 03:32:39 am »
There is no liability in having porn filters hardwired into the code. It has nothing to do with LimeWire. Nearly all the clients have some sort of built-in porn blocking for those who want to turn it on. That has nothing to do with liability over blocking copyrighted content. All a porn filter has to do is look at the name for objectionable terms. And the wisest way to build it would be user-configurable.

Offline White Stripes

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #4 on: April 23, 2012, 03:53:57 am »
Quote
....has to do is look at the name for objectionable terms....

replace 'objectionable' with 'copyrighted' and you may get the idea i was trying to convey...

Offline Plum

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #5 on: April 23, 2012, 05:30:49 am »
Wrong. You cannot know in advance what copyrighted stuff is. You cannot know that by the name. But you can know what is intended to be obscene by name. There is nothing wrong with end-user filters, and it has nothing to do with copyright. You don't even seem to know what kind of filters I am talking about; you are just prejudiced against the term itself, not realizing these are two separate and unrelated technologies.

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #6 on: April 23, 2012, 06:49:09 am »
The cartel and authorities couldn't 'know' that anything on the Megaupload servers was copyrighted, and now lots of people with legitimate material have lost out.

All that aside, IIRC you can exclude terms from the search parameters in WinMX.

Offline GhostShip

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #7 on: April 23, 2012, 01:27:19 pm »
Winzo has a list of blocked terms. I do think it might be a good idea to have such a filter embedded in the code with a user override, but this seems a case of you're damned if you do and damned if you don't as to whether it's on or off by default; people will criticize whatever action is taken, so from my perspective the safest path, in terms of a peaceful life, is to have no filters. I think this is something folks will have to be herded together on to form a consensus.

Offline sharing sarah

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #8 on: April 23, 2012, 02:40:46 pm »
Although I agree with GhostShip about whether you do or don't have it on by default, would it be possible to have that as a plugin at a later date?
That way users could choose to add it or not.

Offline White Stripes

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #9 on: April 23, 2012, 04:32:03 pm »
my worry is simple actually... the kiddy-porn-loving cartel will see that the new client can filter (regardless of what is being filtered) and try moving in with a lawsuit to have the filter expanded... a case that would fail, granted, but the possible court costs, seized domains or other underhanded tactics would definitely be the final nail in the coffin....  ... make sense now?

users who don't want to see it can use the search operators (that have always been there)... that way the user gets the results they want without a 'slippery slope' filter being added... ...at least it's just filenames and not thumbnails... yeesh...

and in a more practical view the porn filter would actually hit false positives... good example of such? http://www.jamendo.com/en/artist/pornophonique ... a Creative Commons release that has no smut whatsoever... just an odd name... lol
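
to illustrate the point, a rough Python sketch of a naive substring blocklist and exactly that kind of false positive (the term list is made up for the example, not taken from any real client):

Code:
BLOCKED_TERMS = ["porn", "xxx"]   # illustrative only

def is_blocked(file_name):
    """Naive substring check against a blocklist of terms."""
    name = file_name.lower()
    return any(term in name for term in BLOCKED_TERMS)

print(is_blocked("pornophonique - track01.mp3"))  # True: a clean Creative Commons release gets caught
print(is_blocked("holiday photos 2012.zip"))      # False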

Offline GhostShip

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #10 on: April 23, 2012, 08:35:59 pm »
As Plum correctly states, Stripes, filtering locally is not the same as filtering traffic across the network, and thus I don't see how any local filter could be expanded to cope with every name in the world that may or may not be copyrighted material.

I also don't see how anyone can police what folks are filtering at home, and there's no way anyone involved in the development group will add any such network filters, as to do so would involve breaking the law regarding folks' privacy and also taking control of and policing the network, something they refuse to take on. No money is changing hands here, and as long as that's the case the cartel can only dream of making speculative claims. Surely the scum at the cartel can continue their fraudulent activities aimed at consumers instead of playing make-believe with a judge; let's be honest here, a UK judge will not entertain such buffoonery.

Offline White Stripes

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #11 on: April 23, 2012, 09:07:04 pm »
if ppl want a filter ok... just as long as it's blank on new installs and there's no way to change what's in that filter by remote in any way...

just a simple CSV box of terms to filter in settings somewhere...

last thing this app needs is more attention from .... undesirables
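
as a rough illustration, a comma-separated box like that could be parsed and applied purely locally, something like this Python sketch (the names are made up, it's not actual client code), starting out empty on a fresh install:

Code:
def load_filter_terms(csv_text):
    """Parse the comma-separated settings box into a clean list of terms."""
    return [t.strip().lower() for t in csv_text.split(',') if t.strip()]

def passes_filter(result_name, terms):
    """Hide a result only if one of the user's own terms appears in its name."""
    name = result_name.lower()
    return not any(t in name for t in terms)

terms = load_filter_terms("")                       # fresh install: box is blank, nothing filtered
print(passes_filter("anything at all.avi", terms))  # True

terms = load_filter_terms("smut, xxx")              # terms the user typed in themselves
print(passes_filter("xxx clip.avi", terms))         # False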

Offline GhostShip

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #12 on: April 23, 2012, 10:08:30 pm »
There will be no remote activity bar that which the WinMX folks allow, such as the blocklist of network parasites, and that's a one-way process. Even client updates are never going to be automatic, as that's a potential issue following the Shareaza site hijack by anti-P2P criminals. This network is wholly owned by its users; if anyone forgets that they will face an angry mob.

PS: Just had a thought that maybe folks could tick a box in the installer to allow or disallow adult-word blocking. That would simply turn on or off the adult terms hard-coded in the client, which should suit most folks, and of course a place in the settings can be added to adjust that if folks want to change their minds.

Offline White Stripes

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #13 on: April 23, 2012, 10:20:08 pm »
er... still don't like the idea of hard-coding a list of any sort.. ..maybe a 'fill with suggestions' button that puts data in the filter list for the end user to go over and remove from/add to...

a hard-coded list can't be removed from/added to... (like the cache servers in the elder client)

Offline GhostShip

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #14 on: April 23, 2012, 10:25:10 pm »
Hmm, well, nothing's set in stone; I was just tossing some ideas out there, and as I say I really do think the folks who care about such things need to thrash the topic out amongst themselves, as the developers are only going to follow the will of the folks in any such avenues. If no consensus arrives it will simply be left out.

Offline Plum

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #15 on: April 26, 2012, 06:48:34 am »
Quote
if ppl want a filter ok... just as long as it's blank on new installs and there's no way to change what's in that filter by remote in any way...

just a simple CSV box of terms to filter in settings somewhere...

last thing this app needs is more attention from .... undesirables

Well, all you have to do is put in end-user filters, no network filters or hard-coded ones. File-sharing is about freedom, and freedom works both ways: freedom to find content and freedom from content; freedom to be trusted to find what you want to find, and freedom at an individual level so that the user won't be forced to see what they don't want to. I use porn-blocking in the various clients, but not because I oppose porn: since I am not looking for porn in my searches, any that turns up is most likely fake or spurious. Other clients offer a check mark to tag if you don't want adult results. That list is hard-coded, but they also have a user-configurable or per-search filter. We already have file type, network speed, sampling rate, and size filters, so a few more options wouldn't hurt. For a while, most of the WinMX traffic from users was refreshes, which indicated flooding, and a few extra options could have reduced the spurious traffic, freeing up bandwidth for legitimate users. There was both the flooding and then the rampant network activity from users trying different things or hitting search again to narrow the results on the screen before hitting stop (to reduce CPU usage). So the abuse and the reaction to the abuse degraded the network.

That is one more reason why I think the search code should be both server side and local - put a query out to the network, and then include code to check that what comes back matches what you searched for. So the query is applied once to the network to get the results and again against the results before displaying them. That last stage could be disabled by default (to make sure metadata results are returned) but be there in case network abuse/spam outweighs the value of searching for metadata. That was also why I proposed two search streams, one for file and folder names, and the other for ID3/EXIF/WMF-style metadata. Then the double filter could be applied only to the name-stream results, while the metadata stream is trusted more blindly (since there is no way to verify it on the client end without downloading at least the metadata portion of the files first), with an option to turn off the metadata stream (incoming only) if it is abused or compromised.
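
As a rough illustration of that double filter, here is a Python sketch (the class and field names are invented for the example, not the actual client code): the query goes out to the network as usual, then each returned name is re-checked locally before display, while metadata-stream hits are passed through since they cannot be checked from the name alone:

Code:
class SearchSession:
    def __init__(self, query, verify_locally=True):
        self.terms = query.lower().split()
        self.verify_locally = verify_locally   # could default to off, as described above

    def accept(self, result):
        """Decide whether a hit returned by the network should be displayed."""
        if not self.verify_locally:
            return True                        # trust the network blindly, as clients do today
        if result.get("stream") == "metadata":
            return True                        # metadata hits can't be checked from the name alone
        name = result.get("path", "").lower()
        return all(t in name for t in self.terms)

session = SearchSession("blue girl")
print(session.accept({"stream": "filename", "path": "video/la blue girl ep1.avi"}))  # True
print(session.accept({"stream": "filename", "path": "spam/buy pills now.exe"}))      # False
print(session.accept({"stream": "metadata", "path": ""}))                            # True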

We as humans don't instantly trust what we are told or given (well, we shouldn't). If someone offers you a candy bar, you probably won't eat it without smelling it to make sure it isn't dog mess, and if a customer hands you a large bill, you get out the counterfeit pen and give the note a quick inspection before accepting it. Someone tried cashing a $1000 bill with President Obama's picture on it; needless to say, they were not allowed to use it. So why should we fully trust the super-peers/primaries and what they return without inspecting it? Stuff that doesn't match what was searched for, comes from a location that shouldn't technically exist, or comes from a peer/primary which gives hits for *every* request should be automatically suspect. Likewise, if porn comes up in every search you run when the terms were not pornographic, then the nodes/primaries in question should be considered suspect.
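
A rough sketch of how a client could flag a primary that answers essentially every request it sees (Python, with made-up thresholds, purely to illustrate):

Code:
from collections import defaultdict

SUSPECT_HIT_RATE = 0.95   # illustrative threshold
MIN_QUERIES = 20          # don't judge a node on a handful of searches

queries_seen = defaultdict(int)
queries_answered = defaultdict(int)

def record(node, answered):
    """Track how often a node returns hits for the queries it is asked."""
    queries_seen[node] += 1
    if answered:
        queries_answered[node] += 1

def is_suspect(node):
    """Flag a node that answers essentially every request it sees."""
    seen = queries_seen[node]
    if seen < MIN_QUERIES:
        return False
    return queries_answered[node] / seen >= SUSPECT_HIT_RATE

for _ in range(25):
    record("primary_a", answered=True)   # hits for every query, however obscure
print(is_suspect("primary_a"))           # True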

As for undesirables, discussion about features should be done anonymously, and nobody should take credit for the code. Keep all nodes and resources in countries with few or no treaties. And make the cache nodes configurable - why edit the hosts file or apply a patch after the next outage or compromise? Let the end user point the client at new host cache servers. Even put prominent notices in the software not to use it illegally, along with all the cartel-type notices and warnings. That could be in the form of a splash screen that is enabled by default (and easily disabled).

What I don't want to see are anti-proxy filters applied across the entire network. If I want to use file-sharing through a proxy, Tor node, or VPN, I should have the right to protect myself in such a manner. Now, if abusers use such things, then users should have options to block traffic from them.

Offline Plum

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #16 on: April 26, 2012, 07:21:01 am »
Quote
There will be no remote activity bar that which the WinMX folks allow, such as the blocklist of network parasites, and that's a one-way process. Even client updates are never going to be automatic, as that's a potential issue following the Shareaza site hijack by anti-P2P criminals. This network is wholly owned by its users; if anyone forgets that they will face an angry mob.

PS: Just had a thought that maybe folks could tick a box in the installer to allow or disallow adult-word blocking. That would simply turn on or off the adult terms hard-coded in the client, which should suit most folks, and of course a place in the settings can be added to adjust that if folks want to change their minds.

Well, you could include a manual update check in the code, with a configurable server location. I am well aware of the domain hijacking of Shareaza and other bumps in the road, like the transition from the relatively secure and stable Kazaa network (even though Kazaa itself was the most spyware-ridden application there was). The hijack was frustrating: I reinstalled it, noticed it begging for money, and couldn't find anything. I searched to see if there were fake versions and then found that the legitimate team had moved to SourceForge. I removed the fake client and installed the one available there, and it was much closer to what I was used to using. That is one of the dangers of using the open-source model for this sort of code: anyone who codes could easily rewrite the client for malicious use and pass it off as the original. Fake files have been around since the BBS days under MS-DOS. You'd be on a BBS and find fake shareware, like a trojan/virus passing itself off as a shareware utility such as PKZIP. PKWare put out notices that they never released certain purported versions and warned not to download them.

Yes, Shareaza has a tick box for adult content. On a search page, you have the search box and type selection on the left, and a filter box in the lower right.  One strategy I use is to search for half the term in the top left box, and filter the rest in the lower box.  That lower right box also has an advanced page where you can tick or untick for adult content, suspected malware, suspected bogus results, size filters, and results that don't match the search.

Even Gnucleus has something similar, or did the last time I used it. It got to where it refused to connect, so I switched to other Gnutella clients at the time. I liked Gnucleus' GnucDNA, or whatever they called it. It was their DLL library, and any client was welcome to use it. That way, everyone could be united on the same network, even with different client features, and developers would have more time to design the interface without messing with the protocol code (unless they wanted to add more functionality). Plus GnucDNA had the advantage of making any client easy to upgrade. The Gnutella 2 protocol was part of that innovation, since G2 was scalable and allowed custom packets to support whatever features someone wanted to put in a client (with the other clients ignoring incompatible packets). So you had both unity and diversity on the network: the unity part made more files available, and the diversity part let you use whichever client had the features you liked.

Offline White Stripes

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #17 on: April 26, 2012, 03:17:24 pm »
Quote
That is one more reason why I think the search code should be both server side and local

um.. quick note... there is no server side... winmx is both server and client....

Offline Plum

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #18 on: July 05, 2012, 10:51:17 pm »
Quote
That is one more reason why I think the search code should be both server side and local

um.. quick note... there is no server side... winmx is both server and client....
You misunderstood me and my own personal jargon, which you could have figured out from the context of how I was using it. There is an interface part of the code (what the user deals with) and a network part of the code, which deals with the other machines. "Server side" is my own personal jargon for "server and client side," or a synonym for "network side." I was obviously speaking of the network layer vs. the user layer of the code in the same program. Also, "local side" as I used it means local to the user. We should all be able to read between the lines and see others' private uses of terms and symbolism from the context.

Let's try again. There are two sides or layers to EVERY piece of file-sharing software: the network side (what I call the "server side") and the user interface (what I called the "local side"). When you do a search in nearly all file-sharing clients, you are giving the network the full responsibility for the search; the user interface has no role other than dispatching it. That is very irresponsible coding if you ask me, regardless of the project or team doing it. If you were running a business and someone handed you a $100 bill, would you not at least look at it and maybe use a detector pen to see if it really is a $100 bill? Even if you were running an illegal business, you wouldn't let your clients get by with giving you fake money. Searching, by its very nature, is a type of filtering. Now, what if the user side also took a role in this as a safety net and complemented the network side of the program? The user side asks the network side to search for something. Then, as the hits come in, the user side evaluates whether each one is what it requested and discards what does not match. I already covered the challenges related to this. Pathname hits would be no problem, since the entire path is returned to the software and merely hidden from the user unless they unhide it, so it would be available for the user interface end to judge against the original search term. The challenge would be metadata hits, and I suggested a separate stream in the protocol for them and the option to disable it (when under attack). I think maybe you can understand me now, and my call for both the user interface code and the network code to be involved in searches.

Let's cover the implications of my modification. Suppose we are under attack with garbage because a custom client is causing a search-poisoning attack. Yes, searches would be much slower than if no attack were in progress, but with my enhancement of getting the user interface to validate the results before displaying them, only seemingly valid results would be shown. My method might require considerably more CPU power during an attack. It would not help against Gnutella-style attacks where new files are created on the fly to match the search; that is a serious vulnerability to overcome. But if you are getting hit with non-matching files, this would help stop it. It should not be relied upon as the only means to stop that scenario, since there are protocol vulnerabilities which make it possible. It would also help reduce the additional traffic created by users constantly leaving and joining the network and constantly repeating their searches, so the signal-to-noise ratio would be much better. Related to this would be the idea of not relaying any hits to other machines if the machine detects invalid hits: if it is not good enough for one machine, don't amplify the work needed for others to deal with it.

Sure, we could try other enhancements. To deal with the file-spawning problem mentioned above, the software could search with one less character in one of the search terms, which might make the search broader. But the user interface logic would again compare the entire original string against what is returned. So my user interface modification above can be further extended to deal with Gnutella-style attacks too, not only by comparing the search terms to the hits, but also by automatically searching with one less character (at least when the search terms are long enough to do this without greatly increasing network load).
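
A rough sketch of that trick (Python, with invented names and an arbitrary length threshold): trim one character from the longest term before sending the query, then compare the full, untrimmed terms against each returned name locally:

Code:
MIN_LEN_TO_TRIM = 6   # only trim reasonably long terms, to limit the extra load

def broadened_query(terms):
    """Drop the last character of the longest term before sending the search."""
    terms = list(terms)
    longest = max(range(len(terms)), key=lambda i: len(terms[i]))
    if len(terms[longest]) >= MIN_LEN_TO_TRIM:
        terms[longest] = terms[longest][:-1]
    return " ".join(terms)

def locally_valid(result_name, original_terms):
    """Compare the full, untrimmed terms against each returned name."""
    name = result_name.lower()
    return all(t in name for t in original_terms)

original = ["documentary", "whales"]
print(broadened_query(original))                               # "documentar whales"
print(locally_valid("blue whales documentary.avi", original))  # True
print(locally_valid("documentar whales spam.exe", original))   # False: a spawned fake matching only the trimmed query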

I can think of more sophisticated solutions, but I see potential problems with those. Reputation-based or snitch-based enhancements might be used in reverse to cause DDoS attacks, because the good machines would censor themselves and the bad machines would have the highest reputation, so only the spammers and flooders would have a voice. Journaling-type approaches could set a bad precedent down the road; producing more information to use forensically is not a good thing for us.

What I mention here should be used as an adjunct to a more secure protocol and as future-proofing, not as a standalone solution. It would be a second line of defense.

Offline DaBees-Knees

Re: ISPs torch UK.gov's smut-blocking master plan
« Reply #19 on: July 06, 2012, 07:17:44 am »
Text-based filters can never work effectively due to the complexity of language and the multiple meanings of words. An example I came across was when using the words "Root Canal", as in the dental treatment. Check out what "root" is understood to mean in South Africa.
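
A quick Python sketch of why that happens (the word list is made up for the example): a blocklist containing a regional slang word also blocks the innocent dental search:

Code:
BLOCKED_WORDS = {"root"}   # harmless in one dialect, crude slang in another

def blocked(query):
    """Word-list filtering with no sense of context."""
    return any(word in BLOCKED_WORDS for word in query.lower().split())

print(blocked("root canal treatment"))  # True: a perfectly innocent dental search gets filtered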
