
GSoC 2015 ntp.conf web generator - Parth

Website Design

According to the requirement, the ntp.conf parsing will need to be done asynchronously in the background, so as not to interfere with the main server threads.

A suggested method is to make use of Celery, a distributed background task queue to which parsing jobs can be handed off. Celery requires a backend, and Redis is suggested for this purpose: being an in-memory database, it ensures that all temporary parsing data is cleared out.
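
A minimal sketch of how that could be wired together, assuming Redis runs locally and using a hypothetical parse_conf task (the real module and task names would be settled during implementation):

from celery import Celery

# Redis is assumed to serve as both the broker and the result backend.
app = Celery("ntpconf",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")

@app.task
def parse_conf(upload_id, conf_text):
    # Placeholder for the real rule engine: it would return the parsed
    # JSON tree (lines, errors, suggestions) keyed by the Uploaded object.
    return {"upload_id": upload_id, "errors": [], "suggestions": []}

The Django view would then only need to call something like parse_conf.delay(upload_id, conf_text) and return immediately, leaving the heavy work to the Celery worker.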

The post-parsed data will be dumped to a file, which is served by the web server directly, without going through the slower, more time-intensive Django backend. The file instance is created when the data is sent to be parsed, but it should not be viewed / shown until the status of the Uploaded object is marked accordingly.

The actual parser is then free to use any set of Python parsing libraries as its backend, on which even CPU-intensive operations can be carried out. This layout allows for the complete decoupling of the parser from the front-end logic, and also allows the parser workers to be moved onto a different device altogether.

As for the front-end itself, the Ace JS library would be used extensively to mark the errors and suggestions. The mentioned REST API allows a completely different front-end to be built, as long as it follows the REST conventions. This also allows CLI usage via curl.

Comments on this design are appreciated.

Summary

Abstract

The ntp.conf file comes pre-shipped with the ntp package and configures the parameters for the NTP daemon, helping it sync with the right servers in the right way. However, ntp.conf is not always written in a way that is correct or efficient.

Hence, I propose to build a web portal that will parse an uploaded ntp.conf and generate a clean ntp.conf based upon a choice-based system.

Proposed Implementation

I propose a REST API variant for the implementation of this project. The server would receive the uploaded files, which would then be matched against a set of rules, resulting in a JSON object. This JSON would then be sent to a client-side parser for display, which would also allow a re-check call.

The Parsed Object

This will be a tree similar to a DOM tree.
The proposed elements are:
-- 1.) The entire file, broken up and indexed line by line, with corrections in place.
-- 2.) An 'errors' tree, containing a list of all the errors detected, with the corresponding line numbers.
-- 3.) A 'suggestions' object. Similar to the 'errors' object, but each entry contains a link to the API call which fetches the related HTML. This is shown to the user as a suggestion.
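
As a rough illustration (the field names here are placeholders, not a fixed schema), the returned JSON tree could look like:

{
  "lines": [
    {"number": 1, "text": "server pool.ntp.org", "corrected": "server pool.ntp.org iburst"}
  ],
  "errors": [
    {"line": 2, "message": "server expects a hostname or IP address"}
  ],
  "suggestions": [
    {"line": 1, "message": "consider adding iburst directive to server",
     "detail_url": "/api/suggestion/iburst/"}
  ]
}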

The 'Error' Detection

The 'admin' of the error detection portal will be granted a web-based UI where the rules for parsing of the file can be added.

This will be split into three categories:
-- 1.) Critical. If the uploaded file fails any of these rules, the returned JSON tree's errors would point that out, and the parse would politely fail, suggesting a change.
For example, not providing a single server directive would result in a critical error.
These are the errors for which nothing can be done to 'correct' the file, and they are also the usual case for spam prevention. (Maybe: too many Criticals would get the uploader banned.)

-- 2.) Error. For each of the lines, the secondary parse takes place line by line. Syntax errors, improper formatting of text, and use of wrong properties would result in this.

For example,

driftfile pool.ntp.org
server /tmp/somefile

would cause an error on both lines. The client parsing this would see it as an error and ask the user to correct the file.

-- 3.) Suggestion. For each of the lines, the suggestions would be sent based upon the rules implemented.

For example,

server time.iiit.ac.in
server pool.ntp.org
server 0.ubuntu.pool.ntp.org

the above would generate the following suggestions:

line 1 : consider adding iburst directive to server
line 2 : pool.ntp.org should be a fallback, so it would be better to put this lower, as better servers might exist.
line 0 : consider using the restrict 127.0.0.1 directive to prevent external inspection.

The 'Rules'

The rules would be a simple selection of if-then statements. An example 'rule' would look like:

if <line> <contains> <"cat"> then <all cases> <raise error> error "I don't trust cats"

if <file> <not contains> <"server"> then <all cases> <raise critical error> error "Not Syncing with anything!"

any {
if <word + 0> <contains> <"server"> then <word + 1> <must be "HOSTNAME">
if <word + 0> <contains> <"server"> then <word + 1> <must be "IP Address">
}
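
For illustration, the example rules above could be stored internally as plain Python data structures along these lines (the field names are placeholders, not a fixed schema):

RULES = [
    {"scope": "line", "filter": ("contains", "cat"),
     "action": ("all cases", "raise error"), "message": "I don't trust cats"},
    {"scope": "file", "filter": ("not contains", "server"),
     "action": ("all cases", "raise critical error"), "message": "Not Syncing with anything!"},
    {"any": [
        {"scope": ("word", 0), "filter": ("contains", "server"),
         "action": (("word", 1), ("must be", "HOSTNAME")),
         "message": "server expects a hostname"},
        {"scope": ("word", 0), "filter": ("contains", "server"),
         "action": (("word", 1), ("must be", "IP Address")),
         "message": "server expects an IP address"},
    ]},
]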

The rules would be provided through a drop-down menu for ease of customization, with free text only for the error messages.
The proposed rules have the format:

if $1 $2 $3 then $4 $5 error $6

$1 -> lists the scope of the rule.
-- <file> : the entire file is treated as a single line, and parsing is done on it as a whole.
-- <line> : the scope of the rule is the current line.
-- <word [ + n ]> : the scope of the rule is the n'th word of the line. If that word is undefined, the rule is skipped for that line.

$2 -> filter characteristic
-- <contains> : the scoped text must contain the filter string.
-- <not contains> : the scoped text must not contain the filter string (as used in the critical-rule example above).
-- <is> : the scoped text must exactly match the filter string.

$3 -> filter
-- <"X"> : the string X is the exact string to which the $2 check is applied.

$4 -> action prefix
-- <all cases> : the next action ($5) is performed unconditionally.
-- any of the scope options listed under $1 : the action is applied to that scope instead.

$5 -> action
An action that can raise an exception. The proposed actions are:
-- <must be "TYPE"> : checks that the text selected by the action prefix exactly matches the TYPE. Some default TYPEs such as IP, host and filepath will be provided. The parser tries to match the text against the TYPE and raises an exception if it does not match; the exception error is the message given in $6.
-- <contains "">
-- <is "">
-- <suggest>

$6 -> message
-- a string used as the error message / suggestion text when the exception (if any) is raised or the suggestion is emitted.

all & any -> special directives, which tell the parser to apply special handling to a block of rules.

-- all : the block passes only if all rules in it pass; otherwise the first exception raised is reported, with its exception text.

-- any : the block passes if any of the rules in it pass; otherwise the last known exception that was raised is reported.
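
A minimal sketch of how such rules could be evaluated, assuming they are stored in the shape shown earlier; the TYPE checks and helper names are illustrative only, not a fixed design:

import ipaddress
import re

def is_type(token, type_name):
    # Illustrative TYPE checks for the <must be "TYPE"> action.
    if type_name == "IP Address":
        try:
            ipaddress.ip_address(token)
            return True
        except ValueError:
            return False
    if type_name == "HOSTNAME":
        return re.match(r"^[A-Za-z0-9.-]+$", token) is not None
    if type_name == "filepath":
        return token.startswith("/")
    return False

def scope_text(scope, line, file_text):
    # Resolve $1 / $4 scopes: <file>, <line> or <word + n>.
    if scope == "file":
        return file_text
    if scope == "line":
        return line
    if isinstance(scope, tuple) and scope[0] == "word":
        words = line.split()
        return words[scope[1]] if scope[1] < len(words) else None
    return None

def matches(kind, text, needle):
    # Resolve $2 / $3 filters: <contains>, <not contains> or <is>.
    if kind == "contains":
        return needle in text
    if kind == "not contains":
        return needle not in text
    if kind == "is":
        return text == needle
    return False

def apply_rule(rule, line_no, line, file_text, report):
    # Returns True when the rule passes (or is skipped), False when it raises.
    text = scope_text(rule["scope"], line, file_text)
    if text is None:
        return True                       # word undefined: rule skipped for this line
    if not matches(rule["filter"][0], text, rule["filter"][1]):
        return True                       # filter did not trigger
    target, action = rule["action"]
    if action == "raise critical error":
        report["critical"].append({"line": line_no, "message": rule["message"]})
        return False
    if action == "raise error":
        report["errors"].append({"line": line_no, "message": rule["message"]})
        return False
    if action == "suggest":
        report["suggestions"].append({"line": line_no, "message": rule["message"]})
        return True
    if isinstance(action, tuple) and action[0] == "must be":
        token = scope_text(target, line, file_text)
        if token is None or not is_type(token, action[1]):
            report["errors"].append({"line": line_no, "message": rule["message"]})
            return False
    return True

def evaluate(rules, conf_text):
    report = {"critical": [], "errors": [], "suggestions": []}
    for line_no, line in enumerate(conf_text.splitlines(), 1):
        for rule in rules:
            if "any" in rule:
                # any-block: pass if at least one alternative passes,
                # otherwise keep the last raised exception.
                scratch = {"critical": [], "errors": [], "suggestions": []}
                results = [apply_rule(r, line_no, line, conf_text, scratch) for r in rule["any"]]
                if not any(results) and scratch["errors"]:
                    report["errors"].append(scratch["errors"][-1])
            elif not apply_rule(rule, line_no, line, conf_text, report) and report["critical"]:
                return report             # critical error: stop immediately
    return report

A call like evaluate(RULES, conf_text), with the structures sketched earlier, would then produce the errors / suggestions tree that the REST API returns.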

Proposed Implementation of the Rules

A noSQL-backed database would be very easy to use for this kind of dynamic query. However, the rules data also appears to be structured enough that MariaDB with an in-memory mirrored database would offer better performance (in my experience).

The parser would read the rules list and apply all the rules. For critical errors, it would simply stop there. For other errors, it would trudge forward until the end.

Then the list of raised exceptions, along with the line numbers for those exceptions, would be generated and sent to a front-end parser.

An Alternative to the 'Rules'

A Python / bash / C program / generic executable that would run with the uploaded ntp.conf as input and generate the JSON. This executable method would give immense power to the programmer and completely eliminate the need for the complex database-dependent machinery, but it would make the back-end less flexible.

In my opinion, the 'rules' are not subject to much change, so for performance and raw power, it would be better to use this alternative 'rules' model.
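
A rough sketch of this executable variant: a standalone Python script that reads the uploaded ntp.conf on stdin and prints the error / suggestion JSON on stdout (the checks shown are only illustrative, not the final rule set):

import json
import sys

def check(conf_text):
    # Illustrative hard-coded checks standing in for the real rule set.
    lines = conf_text.splitlines()
    if not any(l.split() and l.split()[0] == "server" for l in lines):
        return {"critical": [{"message": "Not Syncing with anything!"}],
                "errors": [], "suggestions": []}
    errors, suggestions = [], []
    for no, line in enumerate(lines, 1):
        words = line.split()
        if not words or words[0].startswith("#"):
            continue
        if words[0] == "server" and "iburst" not in words:
            suggestions.append({"line": no, "message": "consider adding iburst directive to server"})
        if words[0] == "driftfile" and (len(words) < 2 or not words[1].startswith("/")):
            errors.append({"line": no, "message": "driftfile expects a file path"})
    return {"critical": [], "errors": errors, "suggestions": suggestions}

if __name__ == "__main__":
    print(json.dumps(check(sys.stdin.read()), indent=2))

Invoked as, say, python check_conf.py < ntp.conf (the script name is hypothetical), it produces exactly the kind of JSON the front-end expects.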

The Run-time

I wish to modularize the front-end and the back-end so that the interface between them happens only via the built REST API; for everything else, they might as well be hosted on opposite ends of the planet. The front-end would insert the uploaded configuration into the database and would long-poll on it until a ready status is received. Then it displays the errors and suggestions.

As for the back-end, it makes a REST GET and receives an ntp.conf to parse.
It parses this input, and on a successful parse, it POSTs the URL of the generated JSON to the API. The actual parsing is done entirely in the back-end.
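
A hedged sketch of that back-end loop using the Python requests library; the endpoint paths and field names are placeholders for whatever the final REST API defines:

import time
import requests

API = "http://example.org/api"            # placeholder base URL

def parse(conf_text):
    # Stand-in for the actual rule engine sketched earlier.
    return {"errors": [], "suggestions": []}

def worker_loop():
    while True:
        # GET the next uploaded ntp.conf that is waiting to be parsed.
        job = requests.get(API + "/pending/").json()
        if not job:
            time.sleep(5)                 # nothing queued yet; try again shortly
            continue
        conf_text = requests.get(job["conf_url"]).text
        result = parse(conf_text)
        # On a successful parse, POST the location of the generated JSON back to the API.
        requests.post(API + "/results/", json={"upload_id": job["id"], "result": result})

if __name__ == "__main__":
    worker_loop()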

Notes

  • HMS: I think we will want to have this emit rlimit memlock 0 by default.
  • HMS: The user should not have to provide a file for input. I was thinking that there might be a screen that defaults to whatever we want, and offers the user the ability to upload a file. In that case, we'd parse the file, and if the parse was successful we'd then import those "directives" and adjust the input screen selections accordingly.
  • PLK: Both of these are not too difficult to do... The default case (when nothing is found) would be to dump a blank errors JSON and a complete suggestions JSON. The tricky part is still the multi-language feature (or at least an API to translate).


Questions To Address

- Will there be any refclocks attached to this machine?

- Will this machine be serving time to any other machines?

Tasks

  • -- Start working on the NTP Conf Web Generator backend. -- Decided to go with a Python backend backed by Celery.
  • -- Fix an API for the returned JSONs. -- API fixed; same as above.
  • xxx Building the front-end for the view and import of ntp.conf. -- A large chunk is ready; a large chunk is pending.

Timeline

Date | Task | Description | % Done
04-27 | Community Bonding | Students get to know mentors, read documentation, get up to speed to begin working on their projects. | Done
05-25 | Coding Begins | Students begin coding for their GSoC projects |
06-26 | BO Midterm Evals | Mentors and students can begin submitting mid-term evaluations. |
07-03 | EO Midterm Evals | Mid-term evaluations deadline. |
07-06 | BO Away Time | Family Timeout |
07-27 | EO Away Time | I'm Back |
08-17 | Wrap-up | Suggested "Pencils Down" date. Take a week to scrub code, write tests, improve documentation, etc. |
08-21 | Firm "Pencils Down" | Mentors, students and organization administrators can begin submitting final evaluations to Google. |
08-28 | Final Evaluation | Final Evaluations Deadline |
08-28 | Code Samples | Students begin uploading code samples |
08-31 | Final Results | Final Results Announced |

