      COMP3310 - Assignment 2: Indexing a Gopher.
      Background:
      • This assignment is worth 12.5% of the final mark.
      • It is due by 23:55 Friday 26 April AEST (end of Week 8)
      • Late submissions will not be accepted, except in special circumstances.
      o Extensions must be requested as early as possible before the due date, with suitable
      evidence or justification.
      • If you would like feedback on particular aspects of your submission, please note that in the
      README file within your submission.
      This is a coding assignment, to enhance and check your network programming skills. The main focus is on
      native socket programming, and your ability to understand and implement the key elements of an
      application protocol from its RFC specification.
      Please note that this is an ongoing experiment for the course, trialling gopher for this assignment. We may
      discover some additional challenges as we go that require adjustments to the assignment activities, or
      a swap of server. Any adjustments will be noted via a forum Announcement.
      Assignment 2 outline
      An Internet Gopher server was one of the precursors to the web, combining a simple query/response
      protocol with a reasonably flexible content server, and a basic model for referencing and describing
      resources on different machines. The name comes from the (Americanised) idea to “go-for” some content…
      and also the complexity of their interconnected burrows¹.
      For this assignment, you need to write your own gopher client in C, Java or Python²,³, without the use of any
      external gopher-related libraries. The client will need to ‘spider’ or ‘crawl’ or ‘index’ a specified server, do
      some simple analysis and reporting of what resources are there, as well as detect, report and deal with any
      issues with the server or its content.
      Your code MUST open sockets in the standard socket() API way, as per the tutorial exercises. Your code
      MUST make appropriate and correctly-formed gopher requests on its own, and capture/interpret the results
      on its own. You will be handcrafting gopher protocol packets, so you’ll need to understand the structures of
      requests/responses as per the gopher RFC 1436.
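A minimal request/response exchange in that style can be sketched as follows (Python shown; the host, selector and timeout values are placeholders, not class-server details). The two RFC 1436 essentials are the CRLF-terminated selector and reading until the server closes the connection:

```python
import socket

def gopher_fetch(host, selector="", port=70, timeout=5.0):
    """Send one gopher request and return the raw reply bytes.

    Sketch only: host, port and timeout are whatever server you test against.
    """
    with socket.create_connection((host, port), timeout=timeout) as sock:
        # RFC 1436: a request is just the selector string terminated by CR LF.
        sock.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # the server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks)
```

Note that the terminator must be the two bytes CR LF; sending a bare "\n" is exactly the line-ending trap the assignment warns about.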
      We will provide a gopher server to run against, with a mix of content – text and binary files, across some
      folder structure, along with various pointers to resources.
      In the meantime, you SHOULD install a gopher server on your computer for local access, debugging and
      wiresharking. There are a number available: pygopherd is perhaps more recently updated but more
      complex, while Motsognir is a bit older but simpler. If you find another good one, please share on the
      forum.
      ¹ https://en.wikipedia.org/wiki/Gopher
      ² Since most high-performance networking servers, and kernel networking modules, are written in C with other
      languages a distant second, it is worth learning. But time is short, and everyone has a different background.
      ³ If you want to use another language (outside of C/Java/Python), discuss with your tutor – it has to have native
      socket access, and somebody on the tutoring team has to be able to mark it.
      Wireshark will be very helpful for debugging purposes. A common trap is not getting your line-ending right on
      requests, and this is rather OS and language-specific. Remember to be conservative in what you send and
      reasonably liberal in what you accept.
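"Liberal in what you accept" applies directly to menu parsing. An RFC 1436 menu reply is CRLF-separated lines of the form <type-char><display>TAB<selector>TAB<host>TAB<port>, ending with a lone "." line; real servers are sloppier, so a tolerant parser might fall back to defaults rather than crash on short lines. A sketch:

```python
def parse_menu(reply):
    """Tolerantly parse a gopher menu reply into item tuples.

    Returns a list of (type, display, selector, host, port). Missing
    trailing fields default to "" / port 70 instead of raising, since
    not every server sends perfectly formed lines.
    """
    items = []
    for line in reply.split("\r\n"):
        if line in ("", "."):  # blank padding or the "." terminator
            continue
        itype, rest = line[0], line[1:]
        fields = rest.split("\t")
        display = fields[0] if len(fields) > 0 else ""
        selector = fields[1] if len(fields) > 1 else ""
        host = fields[2] if len(fields) > 2 else ""
        port = int(fields[3]) if len(fields) > 3 and fields[3].isdigit() else 70
        items.append((itype, display, selector, host, port))
    return items
```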
      What your successful and highly-rated indexing client will need to do:
      1. Connect to the class gopher server, and get the initial response.
      a. Wireshark (just) this initial-response conversation in both directions, from the starting TCP
      connection to its closing, and include that wireshark summary in your README.
      b. The class gopher site is not yet fully operational; an announcement will be made when it’s ready.
      2. Starting with the initial response, automatically scan through the directories on the server, following links
      to any other directories on the same server, and download any text and binary (non-text) files you find.
      The downloading allows you to measure the file characteristics. Keep scanning till you run out of
      references to visit. Note that there will be items linked more than once, so beware of getting stuck in a
      loop.
      3. While running, print to STDOUT:
      a. The timestamp (time of day) of each request, with
      b. The client-request you are sending. This is good for debugging and checking if something gets
      stuck somewhere, especially when dealing with a remote server.
      4. Count, possibly store, and (at the end of the run) print out:
      a. The number of Gopher directories on the server.
      b. The number, and a list of all simple text files (full path)
      c. The number, and a list of all binary (i.e. non-text) files (full path)
      d. The contents of the smallest text file.
      e. The size of the largest text file.
      f. The size of the smallest and the largest binary files.
      g. The number of unique invalid references (those with an “error” type)
      h. A list of external servers (those on a different host and/or port) that were referenced, and
      whether or not they were "up" (i.e. whether they accepted a connection on the specified port).
      i. You should only connect to each external server (host+port combination) once. Don't
      crawl their contents! We only need to know if they're "up" or not.
      i. Any references that have “issues/errors”, that your code needs to explicitly deal with.
      Requests that return errors, or that had to abort (e.g. due to a timeout, or for any other reason) do not count
      towards the number of (smallest/largest)(text/binary) files.
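Steps 2 and 3 above can be sketched as a breadth-first crawl with a visited set (the loop guard) and a timestamped log of each request. The `fetch` and `parse_menu` parameters below are assumed helper names, not anything the assignment prescribes; any working request and menu-parse routines would slot in:

```python
from datetime import datetime

def crawl(host, port, fetch, parse_menu):
    """Index one gopher server breadth-first (a sketch of steps 2-3).

    fetch(host, port, selector) returns a menu reply; parse_menu(reply)
    yields (type, display, selector, host, port) tuples. A visited set
    stops re-requesting items that are linked more than once.
    """
    visited = set()
    todo = [""]  # the root selector
    dirs, texts, bins = set(), set(), set()
    while todo:
        selector = todo.pop(0)
        if selector in visited:
            continue
        visited.add(selector)
        # Step 3: timestamp and the outgoing client-request, to STDOUT.
        print(f"[{datetime.now():%H:%M:%S}] request: {selector!r}")
        for itype, _display, sel, ihost, iport in parse_menu(fetch(host, port, selector)):
            if (ihost, iport) != (host, port):
                continue  # external server: checked once elsewhere, never crawled
            if itype == "1":  # directory: record it and queue it
                dirs.add(sel)
                todo.append(sel)
            elif itype == "0":  # simple text file
                texts.add(sel)
            elif itype in ("9", "I", "g", "5"):  # binary-ish item types
                bins.add(sel)
    return dirs, texts, bins
```

Using sets rather than lists keeps the counts unique even when the same item is linked from several directories.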
      You will need to keep an eye on your client while it runs, as some items might be a little challenging if you’re
      not careful… Not every server provides perfectly formed replies, nor in a timely fashion, nor properly
      terminated file transfers, for example. Identify any such situations you find on the gopher server in your
      README or code comments, and how you dealt with each of them – being reasonably liberal in what you
      accept and can interpret, or flagging what you cannot accept.
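One defensive pattern for stalled or never-terminated transfers is a timeout-guarded, size-capped read. The cap and timeout below are arbitrary illustrative values, not assignment requirements; the point is that the client always gets control back:

```python
import socket

def recv_capped(sock, max_bytes=1_000_000, timeout=5.0):
    """Read a reply, but abort on stalls or runaway transfers.

    Returns (data, clean): clean is True only if the server closed the
    connection normally. Guards against servers that never terminate a
    file transfer or simply stop sending.
    """
    sock.settimeout(timeout)
    chunks, total = [], 0
    try:
        while total < max_bytes:
            data = sock.recv(4096)
            if not data:
                return b"".join(chunks), True  # clean close
            chunks.append(data)
            total += len(data)
    except socket.timeout:
        pass  # server stalled: give up on this item
    return b"".join(chunks), False  # aborted or truncated
```

Items that abort this way can then be reported as "issues" and excluded from the smallest/largest file statistics, as required above.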
      We will test your code against the specified gopher, and check its outputs. If you have any uncertainties
      about how to count some things, you can ask your tutor or in the forum. In general, if you explain in your
      README how you decide to count things and handle edge-cases, that will be fine.
      You can make your crawler's output pretty or add additional information if you'd like, but don't go
      overboard. We need to be able to easily see everything that's listed here.
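For item 4.h above, a single TCP connect (no gopher request, no crawl of the content) is enough to classify an external server as up or down. A sketch, with an arbitrary 3-second timeout; call it once per unique (host, port) pair and memoise the results:

```python
import socket

def external_up(host, port, timeout=3.0):
    """Return True if (host, port) accepts a TCP connection.

    We connect once and immediately close, without sending anything:
    the assignment only asks whether the external server is "up".
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```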
      Submission and Assessment
      There are a number of existing gopher clients, servers and libraries out there, many of them with source.
      While perhaps educational for you, the assessors know they exist and they will be checking your code against
      them, and against other submissions from this class.
      You need to submit your source code and a README file (text/word/pdf). Please provide any instructions
      to run the code, plus any additional comments and insights, in the README. Your submission must be a
      zip file, packaging everything as needed, and submitted through the appropriate link on wattle.
      Your code will be assessed on the following [marks available shown in %]:
      1. Output correctness [45%]
      o Does the gopher server correctly respond to all of your queries?
      o Does your code report the right numbers? (within your interpretation, perhaps)
      o Does your code cope well with issues it encounters?
      o Does your code provide the running log of requests as above?
      2. Performance [10%]
      o A great indexer should run as fast as the server allows, and not consume vast amounts of
      memory, nor take a very long time. There won’t be too many resources on the server.
      3. Code “correctness, clarity, and style” [45%]
      o Use of native sockets, writing own gopher requests correctly.
      o Documentation, i.e. comments in the code and the README - how easily can somebody else
      pick this code up and, say, modify it.
      o How easy the code is to run, using a standard desktop environment.
      o How does it neatly handle edge-cases, where the server may not be responding perfectly.
      During marking your tutor may ask you to explain some particular coding decisions.
      Reminder: Wireshark is very helpful to check behaviours of your code by comparing against existing gopher
      clients (some are preinstalled in Linux distributions, or are easily added). There are a number of youtube
      videos on gopher as well that e.g. show how the clients work. Your tutors can help you with advice (direct or
      via the forum) as can fellow students. It’s fine to work in groups, but your submission has to be entirely your
      own work.
