Update: Code is present at github/sairam/daata-portal
The product is aimed at developers: a place to store arbitrary information, such as partial extracts from logs or whole log files, similar to S3 but hosted internally and without the need to scale. The aim is to provide one tool through which the whole company can share data as well as information.
Running commands across all machines is standard practice, but capturing the output and cleaning it up usually means writing one-off scripts, especially when you are debugging during a service outage.
I have talked to ten of my developer/devops friends; some liked and appreciated the use case, while others were more inclined to write and maintain a tool for their own specific need.
My idea of the API looks like a regular pastebin with apps. The tool should be hosted with authentication, or within a company's ecosystem.
daata is the simplest name I came up with, signifying data, and it has a .xyz domain too.
Some of the features that are currently present:
- Upload static files
- Host text files
- Host flame graphs from your code
- Upload a zip file to extract the files
  - ideal, for example, for hosting documentation generated during a build
- Host website HTML mocks from designers
- Host static websites/pages/single page apps
- bit.ly-like URL redirection, useful when sharing links with teams
- Track simple metrics as time/key/value points, charted into a graph for quick insights
- Proxying/Mocking HTTP requests
- Catch and respond to HTTP requests in local/staging systems, such as SMS/email callbacks
- Mock requests in local environments using production data
- Proxy requests through the service to capture and debug information
- Replay captured requests against one or more services
- Features like ngrok to proxy local connections to a central setup
- Display log file stats, similar to Kibana
- Pluggable modules that can be linked
- Sharing files within the network/intranet (p2p sharing)
I started the analysis in early September 2016 and began coding on September 15, 2016. Progress has been a bit slow since I am new to the language, and the other languages I have worked with continue to influence how I approach things.
P.S. Do not spend time finding a good name for your project; you can always change it later. I have spent more brain cycles trying to find a name than coding it up or marketing it.