5. Conclusions

5.3 Future Works and Further Suggestions

With the rapid growth of the Internet, an Internet library consisting of works on the Internet will play an increasingly important role in knowledge dissemination and cultural exchange, and software robots will inevitably become one of the most necessary and useful tools in organizing a successful Internet library. Consequently, identifying the scope of copyright authorization is essential, because this factor determines both the total size of the collection and, more importantly, the risk of copyright infringement litigation. Accompanying this trend, a vital problem is how to use robots effectively and lawfully to complete their tasks. Compared to other measures, the Robots.txt file and Robots Meta tags are the most commonly used tools for helping robots and webmasters cooperate to achieve this goal. Furthermore, based on the cases discussed, the Robots.txt file and Robots Meta tags have gradually evolved from merely voluntary advice into a set of potentially enforceable instruments that can express a webmaster's will and preferences. In light of this new function, and of the uncertainties that still need clarification, the new version of the Robots.txt file and Robots Meta tags proposed in this thesis will play a more important role in the future Internet, since it can take on more serious responsibilities in the resolution of future disputes.
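To illustrate concretely how a compliant robot consults these instructions, Python's standard library includes a robots.txt parser. The file below is a hypothetical sketch (the paths and robot names are illustrative, not taken from the cases discussed):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt such as a library webmaster might publish:
# all robots are kept out of /private/, and one named robot is barred entirely.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: ArchiveBot
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# A generic robot matches the "*" record: only /private/ is off limits.
print(rp.can_fetch("GenericBot", "http://example.org/index.html"))      # True
print(rp.can_fetch("GenericBot", "http://example.org/private/a.html"))  # False

# ArchiveBot has its own record and is excluded from the whole site.
print(rp.can_fetch("ArchiveBot", "http://example.org/index.html"))      # False
```

Note that, as the cases above show, nothing in the protocol itself forces a robot to call such a check; compliance remains a matter for the robot writer, which is precisely why the legal status of these instructions matters.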

Also, a librarian experienced in acquiring online digital materials will understand the scope and implications of our method of implementing a CC license. As mentioned above, the implementation method of the current CC licensing framework is too complex to be widely used. CCFE instead attaches the license data via the file name itself. Most search engines can therefore allow users to limit their queries to CC-licensed files without any changes to their existing software or systems. Finally, CCFE works on both text and binary files, a feature simply not supported by the present CC license method.
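The filename-embedding idea can be sketched in a few lines. The naming convention assumed below (a CC license code inserted between the base name and the extension, e.g. "paper.by-nc.pdf") is a simplification for illustration, not necessarily the exact CCFE scheme:

```python
# Hypothetical CCFE-style convention for illustration: the CC license code
# sits between the base name and the extension, e.g. "paper.by-nc.pdf".
CC_CODES = {"by", "by-sa", "by-nc", "by-nd", "by-nc-sa", "by-nc-nd"}

def extract_cc_license(filename):
    """Return the embedded CC license code, or None if the name carries none."""
    parts = filename.lower().split(".")
    if len(parts) >= 3 and parts[-2] in CC_CODES:
        return parts[-2]
    return None

# Works identically for text and binary files, since only the name is read.
print(extract_cc_license("thesis.by-nc-sa.pdf"))  # by-nc-sa
print(extract_cc_license("photo.jpg"))            # None
```

Because the license travels in the name rather than inside the file, any system that can match filename patterns, including an ordinary search engine query, can filter for CC-licensed works.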

However, as we have seen in the previous sections, one of the most notable features of the CC license scheme is that the user can choose the jurisdiction. In fact, this feature of the CC license may give rise to ambiguities in the context of the Internet, as most distribution on the Internet crosses borders.

For example, a work licensed under the US “by” CC license may raise doubts about further modification in civil law countries, which show greater respect for the author's moral rights. There are two possible approaches to resolving this problem.

The first is organizing a group, like iCommons, an international voluntary organization consisting of legal experts (Lessig, 2005b), to ensure that each CC license in the various jurisdictions follows the same basic rules. However, there has been no systematic study examining the equivalence of the present licenses across jurisdictions.

Another approach is to design a standard CC license that includes only the minimum rights embraced in the WIPO Copyright Treaty and other related international copyright conventions.

Moreover, keeping all the advantages and features of the Robots.txt file and Robots Meta tags in mind, a promising future research direction is to combine Robots.txt, Robots Meta tags, the CC license, and other online licenses into a more powerful tool for implementing online copyright, one that can be used not only by webmasters but also by each author of copyrighted works within a website, such as the authors of video files on Youtube (Youtube, 2008), to express their complicated authorization scopes. Furthermore, apart from the copyright-related subjects discussed in this thesis, the great power of software robots now also raises concerns about revealing personal privacy and about data protection on the Internet (Thelwall and Stuart, 2006). Since the Robots.txt file and the Meta tags are tools specifically designed for software robots, they may, hopefully, constitute a new regime for dealing with privacy and data protection issues.
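One possible shape for such a combined tool, sketched here purely as an assumption rather than a settled proposal, is a page head that pairs the existing Robots Meta tag with the rel="license" link convention that Creative Commons already uses to mark licensed works in HTML:

```html
<head>
  <title>Example article page</title>
  <!-- Existing Robots Meta tag: allow indexing, but forbid cached copies -->
  <meta name="robots" content="index, noarchive">
  <!-- CC's rel="license" convention marks the license governing this page's work -->
  <link rel="license" href="http://creativecommons.org/licenses/by-nc/3.0/">
</head>
```

In this sketch the robot-facing instruction (what may be crawled and cached) and the human- and machine-readable license (what may be done with the work afterwards) travel together in one document, which is the combination the paragraph above envisages.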

References

1. ACM Digital Library (2009), “Term of Usage: Digital Library”, available at: http://www.acm.org/publications/policies/use (accessed July 3, 2007).

2. Bailey, C. W. (2006), “What is open access?”, available at: http://www.digital-scholarship.org/cwb/WhatIsOA.pdf (accessed March 3, 2009).

3. BBC (2006), “BBC Creative Archive pilot has ended”, available at: http://creativearchive.bbc.co.uk/news/archives/2006/09/hurry_while_sto.html (accessed July 3, 2007).

4. Barker, P. (2001), “Creating the Digital Library. A Special Report from the Primary Research Group: Book Review”, The Electronic Library, Vol 19 No 3, pp. 186-187.

5. Bolin, R. (2006), “Locking down the library: How copyright, contract, and cybertrespass block Internet archiving”, Pepperdine Law Review, Vol 33, p. 761.

6. Budapest Open Access Initiative (2002), “Budapest Open Access Initiative, 14 February 2002”, available at: http://www.soros.org/openaccess/read.shtml (accessed March 3, 2009).

7. CcPublisher (2007), available at: http://wiki.creativecommons.org/CcPublisher (accessed Dec. 3, 2007).

8. ClickZ Network (2007), “U.S. Search Engine Rankings, September 2007”, available at: http://www.clickz.com/3627655 (accessed Dec. 3, 2007).

9. Chau, M. and Chen, H. (2003), “Personalized and focused Web spiders”, in: Zhong, N., Liu, J. and Yao, Y. (Eds.), Web Intelligence, Springer-Verlag, pp. 197-217.

10. Cheong, F.C. (1996), “Internet Agents: Spiders, Wanderers, Brokers, and Bots”, New Riders Publishing, Indiana, USA.

11. Citeseer.ist (1997), available at: http://citeseer.ist.psu.edu/ (accessed July 3, 2007).

12. CiteseerX.ist (2009), “Submit Documents to CiteseerX”, available at: http://citeseerx.ist.psu.edu/submit (accessed Jan. 23rd, 2009).

13. Copiepresse (2007), Copiepresse v. Google, Inc., No. 06/10.928/C (Feb. 2, 2007).

14. Copiepresse (2009), available at: http://www.copiepresse.be (accessed Feb 1st, 2009).

15. Conner, S. (1996), “An Extended Standard for Robot Exclusion”, available at: http://www.conman.org/people/spc/robots2.html (accessed March 11, 2008).

16. Cornish, W. and Llewelyn, D. (2003a), “Intellectual Property: Patents, Copyright, Trade Marks and Allied Rights”, Sweet & Maxwell, London, p. 475.

17. Cornish, W. and Llewelyn, D. (2003b), “Intellectual Property: Patents, Copyright, Trade Marks and Allied Rights”, Sweet & Maxwell, London, p. 485.

18. Creative Commons (2009q), “XMP”, available at: http://wiki.creativecommons.org/XMP (accessed Feb. 3, 2009).

19. Digital Library Federation (1998), “A working definition of digital library”, available at: www.diglib.org/about/dldefinition.htm (accessed July 20, 2008).

20. Drott, M. C. (2002), “Indexing aids at corporate websites: the use of robots.txt and META tags”, Information Processing and Management, Vol 38 No 2, pp. 209-219.

21. eBay (2000), eBay Inc. v. Bidder’s Edge, Inc., 100 F. Supp. 2d 1058 (N.D. Cal. 2000).

22. Feigin, Eric J. (2004), “Architecture of Consent: Internet Protocols and Their Legal Implications”, Stanford Law Review, Feb 2004, pp. 901-942.

23. Feldman, S. (2004), “Interview: Brewster Kahle”, ACM Queue, Vol 2 Iss 4, June 2004, pp. 24-33.

24. Field (2006), Field v. Google, Inc., 412 F. Supp. 2d 1106 (D. Nev. 2006), available at: http://w2.eff.org/IP/blake_v_google/google_nevada_order.pdf (accessed March 11, 2008).

25. Fielding, R., Gettys, J., Mogul, J. C., Frystyk, H., Masinter, L., Leach, P. and Berners-Lee, T. (1999), “Hypertext Transfer Protocol - HTTP/1.1”, available at: http://www.w3.org/Protocols/rfc2616/rfc2616.html (accessed March 11, 2009).

26. Flickr (2004), available at: http://www.flickr.com/creativecommons/ (accessed July 3, 2009).

27. Free Software Foundation (1999), “The Free Software Definition”, available at: http://www.gnu.org/philosophy/free-sw.html (accessed July 3, 2008).

28. Free Software Foundation (2006), “Digital Restrictions Management and Treacherous Computing”, available at: http://www.fsf.org/campaigns/drm.html (accessed July 3, 2008).

29. Free Software Foundation (2007), “GNU General Public License Version 3”, available at: http://www.gnu.org/licenses/gpl-3.0.txt (accessed July 3, 2008).

30. Google (2008a), “How Google crawls my site?”, available at: http://www.google.com/support/webmasters/bin/topic.py?topic=8843 (accessed March 11, 2008).

31. Google (2008b), “Preventing content from appearing in Google search results”, available at: http://www.google.com/support/webmasters/bin/topic.py?topic=8459 (accessed March 11, 2008).

32. Google (2007c), “Google Advanced Search”, available at: http://www.google.com/advanced_search?hl=en (accessed July 3, 2007).

33. Gorman, G. E. (2006), “Giving way to Google”, Online Information Review, Vol 30 Iss 2, pp. 97-99.

34. Hart, Michael S. (2004), “Gutenberg Mission Statement by Michael Hart”, available at: http://www.gutenberg.org/wiki/Gutenberg:Project_Gutenberg_Mission_Statement_by_Michael_Hart (accessed Jan. 3rd, 2009).

35. Hirtle, P. B. (2003), “Digital Preservation and Copyright”, available at: http://fairuse.stanford.edu/commentary_and_analysis/2003_11_hirtle.html (accessed Jan. 3rd, 2009).

36. Hundie, K. (2003), “Library operations and Internet resources”, The Electronic Library, Vol 21 No 6, pp. 555-564.

37. International DOI Foundation (2008), “Frequently Asked Questions about the DOI System: 1. What is a DOI® name?”, available at: http://www.doi.org/faq.html#1 (accessed March 3rd, 2009).

38. Internet Archive (2009), available at: http://www.archive.org/index.php (accessed Jan. 3rd, 2009).

39. Issuu (2009a), available at: http://issuu.com (accessed Jan. 3rd, 2009).

40. Issuu (2009b), “Copyright FAQ”, available at: http://issuu.com/about/copyright (accessed Jan. 3rd, 2009).

41. Jones, P. (2001), “Open (source)ing the doors for contributor-run digital libraries”, Communications of the ACM, Vol 44 Iss 5, pp. 45-46.

42. Koster, M. (1993), “Guidelines for Robot Writers”, available at: http://www.robotstxt.org/guidelines.html (accessed March 11, 2008).

43. Koster, M. (1994), “A Standard for Robot Exclusion”, available at: http://www.robotstxt.org/orig.html (accessed March 11, 2008).

44. Koster, M. (1995), “Robots in the Web: threat or treat?”, available at: http://www.robotstxt.org/wc/threat-or-treat.html (accessed March 11, 2008).

45. Koster, M. (1996), “Evaluation of the standard for robots exclusion”, available at: http://www.robotstxt.org/wc/eval.html (accessed March 11, 2008).

46. Koster, M. (1997), “HTML Author’s Guide to the Robots Meta tags”, available at: http://www.robotstxt.org/wc/meta-user.html (accessed March 11, 2008).

47. LaMacchia, B. (2002), “Key Challenges in DRM: An Industry Perspective”, Proceedings of the 2002 ACM Workshop on Digital Rights Management, Wellington, pp. 51-60.

48. Lessig, L. (2004), “Free culture: how big media uses technology and the law to lock down culture and control creativity”, Penguin Press, New York.

49. Lessig, L. (2005a), “CC in Review: Lawrence Lessig on CC & Fair Use”, available at: http://creativecommons.org/weblog/entry/5681 (accessed March 11, 2009).

50. Lessig, L. (2005b), “CC in Review: Lawrence Lessig on iCommons”, available at: http://creativecommons.org/weblog/entry/5700 (accessed March 11, 2009).

51. Lessig, L. (2006a), “Code: And Other Laws of Cyberspace, Version 2.0”, Basic Books, New York, p. 123.

52. Lessig, L. (2006b), “Code: And Other Laws of Cyberspace, Version 2.0”, Basic Books, New York, p. 124.

53. Lin, Y-H., Ko, T-M., Chuang, T-R. and Lin, K-J. (2006), “Open Source Licenses and the Creative Commons Framework: License Selection and Comparison”, Journal of Information Science and Engineering, Vol 22 No 2, pp. 1-17.

54. Lopatin, L. (2006), “Library digitization projects, issues and guidelines: A survey of the literature”, Library Hi Tech, Vol 24 No 2, p. 273.

55. McCray, A. T. and Gallagher, M. E. (2001), “Principles for digital library development”, Communications of the ACM, Vol 44 Iss 5, pp. 48-54.

56. Millard, C. (2007), “Copyright In Information Technology and Data”, in: Reed, C. and Angel, J. (Eds.), Computer Law, Oxford University Press, pp. 337-396.

57. MSN (2008a), “Control which pages of your website are indexed”, available at: http://search.msn.com/docs/siteowner.aspx?t=SEARCH_WEBMASTER_REF_RestrictAccessToSite.htm (accessed March 11, 2009).

58. MSN (2008b), “Limit crawl frequency”, available at: http://search.msn.com/docs/siteowner.aspx?t=SEARCH_WEBMASTER_REF_RestrictAccessToSite.htm (accessed March 11, 2009).

59. National Science Foundation (1999), “Digital Libraries Initiative: Available Research, US Federal Government”, available at: http://dli2.nsf.gov/dlione/ (accessed March 11, 2008).

60. Netcraft (2008), “Netcraft Web Server Survey”, available at: http://news.netcraft.com/archives/web_server_survey.html (accessed July 9, 2008).

61. ODRL Initiative (2005), “ODRL Creative Commons Profile”, available at: http://odrl.net/Profiles/CC/SPEC.html (accessed March 9, 2009).

62. ODRL Initiative (2009), “The international effort to develop and promote ODRL”, available at: http://odrl.net/ (accessed March 9, 2009).

63. Open Content Alliance (2009), “Open Content Alliance: FAQ”, available at: http://www.opencontentalliance.org/faq/ (accessed March 13, 2009).

64. Open Source Initiative (2006), “The BSD License”, available at: http://www.opensource.org/licenses/bsd-license.php (accessed March 9, 2009).

65. O'Leary, Mick (2009), “Open Content Alliance Embodies Open Source Movement”, Information Today, Vol 26 Iss 1, pp. 37-38.

66. Rao, S.S. (2003), “Copyright: its implications for electronic information”, Online Information Review, Vol 27 Iss 4, pp. 264-275.

67. Raghavan, S. and Garcia-Molina, H. (2001), “Crawling the Hidden Web”, Proceedings of the 27th International Conference on Very Large Data Bases (VLDB), Sep. 11-14, 2001, Rome, Italy, available at: http://dbpubs.stanford.edu:8090/pub/2000-36 (accessed July 3, 2007).

69. Reed, C. (2004), “Internet Law: Text and Materials”, Cambridge University Press, Cambridge, p. 71.

70. RFC1738 (1994), available at: ftp://ftp.rfc-editor.org/in-notes/rfc1738.txt (accessed July 3, 2007).

71. Rosenblatt, B. (1997), “The Digital Object Identifier: Solving the Dilemma of Copyright Protection Online”, Journal of Electronic Publishing, Vol 3 Iss 2, pp. 135-156.

72. Samuelson, P. (2003), “Unsolicited Communications as Trespass?”, Communications of the ACM, Vol 46 No 10, pp. 15-20.

73. Scribd (2009), available at: http://www.scribd.com/ (accessed Jan. 3rd, 2009).

74. Seadle, M. (2006), “Copyright in the networked world: using facts”, Library Hi Tech, Vol 24 No 3, pp. 463-468.

75. Seadle, M. and Greifeneder, E. (2007), “Defining a digital library”, Library Hi Tech, Vol 25 No 2, pp. 169-173.

76. Sieman, J. S. (2007), “Using the implied license to inject common sense into digital copyright”, North Carolina Law Review, Vol 85, pp. 885-930.

77. Smith, G. (2007), “Copiepresse v Google - the Belgian judgment dissected”, available at: http://www.birdbird.com/english/publications/articles/Copiepresse-v-Google.cfm?RenderForPrint=1 (accessed July 2, 2008).

78. Snyder, H. and Rosenbaum, H. (1998), “How Public is the Web? Robots, Access, and Scholarly Communication”, Proceedings of the 61st Annual Meeting of the American Society for Information Science, Vol 35, pp. 453-462.

79. Spinello, Richard A. (2007), “Intellectual property rights”, Library Hi Tech, Vol 25 No 1, pp. 12-22

80. Sterling, J.A.L. (2003a), “World Copyright Law”, Sweet & Maxwell, London, p. 530.

81. Sterling, J.A.L. (2003b), “World Copyright Law”, Sweet & Maxwell, London, p. 533.

82. Sterling, J.A.L. (2003c), “World Copyright Law”, Sweet & Maxwell, London, p. 17.

83. Sterling, J.A.L. (2003d), “World Copyright Law”, Sweet & Maxwell, London, p. 337.

84. Sterling, J.A.L. (2003e), “World Copyright Law”, Sweet & Maxwell, London, p. 531.

85. Sutter, G. (2007), “Online Intermediaries”, in: Reed, C. and Angel, J. (Eds.), Computer Law, Oxford University Press, pp. 233-282.

86. Tan, P-N. and Kumar, V. (2002), “Discovery of web robot sessions based on their navigational patterns”, Data Mining and Knowledge Discovery, Vol 6 No 1, pp. 9-35.

87. Thelwall, M. and Stuart, D. (2006), “Web crawling ethics revisited: Cost, privacy, and denial of service”, Journal of the American Society for Information Science and Technology, Vol 57 No 13, pp. 1771-1779.

88. Wikipedia (2001), “GNU Free Documentation License”, available at: http://en.wikipedia.org/wiki/GNU_Free_Documentation_License (accessed July 3, 2007).

89. Wikipedia (2007a), “Comparison of file systems”, available at: http://en.wikipedia.org/wiki/Comparison_of_file_systems (accessed Dec. 3, 2007).

90. Wikipedia (2007b), “Resource Description Framework”, available at: http://en.wikipedia.org/wiki/Resource_Description_Framework (accessed Dec. 3, 2007).

91. Yahoo (2007), “Creative Commons Search”, available at: http://search.yahoo.com/cc (accessed July 3, 2007).

92. Yahoo (2008a), “Yahoo! Slurp - Yahoo!'s Web Crawler”, available at: http://help.yahoo.com/l/us/yahoo/search/webcrawler/ (accessed March 11, 2008).

93. Yahoo (2008b), “How do I prevent my site or certain subdirectories from being crawled?”, available at: http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-02.html (accessed March 11, 2008).

94. Yahoo (2008c), “How do I keep my page from being cached in Yahoo! Search?”, available at: http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-05.html (accessed March 11, 2008).

95. Yahoo (2008d), “How can I reduce the number of requests you make on my web site?”, available at: http://help.yahoo.com/l/us/yahoo/search/webCrawler/slurp-03.html (accessed March 11, 2009).